Python script to fetch stock prices for GnuCash using the http://alphavantage.co/ API. Since Finance::Quote fetching from Yahoo Finance no longer works, this is an alternative.
# INSTALL DEPENDENCIES BEFORE RUNNING: pandas, sqlite3, sqlalchemy, requests
# There is one thing you have to do though: gnucash needs to know the price's currency and it needs a guid (unique identifier)
# for this. This guid is unique for each installation, so you need to find yours and assign it to CURRENCY_GUID.
# OR: you can just leave it, and the script will fetch the currency from the last price added
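# (Illustrative only, not required: you could also look the guid up directly in the database,
# e.g. for a book whose currency is BRL -- substitute your own currency mnemonic:
#   SELECT guid FROM commodities WHERE namespace = 'CURRENCY' AND mnemonic = 'BRL';
# )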
import pandas as pd
import sqlite3
import sqlalchemy
from sqlalchemy import create_engine
import datetime
import uuid
import requests
import json
import logging

logger = logging.getLogger(__name__)

# PUT YOUR GNUCASH sqlite3 DATABASE FILENAME HERE. THIS SCRIPT SHOULD BE IN THE SAME FOLDER.
file_name = 'itau.gnucash'
engine = create_engine('sqlite:///' + file_name)

CURRENCY_GUID = None  # you can hardcode the currency guid for the stock prices you are fetching. Ex.: '12488f5b3fa63a6e8dd1c7ae6bde3875'

pd.options.display.float_format = '{:,.2f}'.format
pd.options.display.max_columns = 200

# GET RELEVANT TABLES
logger.info('READING TABLES')
prices = pd.read_sql_table('prices', engine)
com = pd.read_sql_table('commodities', engine)

# THIS IS JUST TO FACILITATE INSPECTION. YOU MAY DELETE IT IF YOU WISH.
full = prices.merge(com, how='left', left_on=['commodity_guid'], right_on=['guid'], suffixes=('', '_commodity'))
full = full.merge(com, how='left', left_on=['currency_guid'], right_on=['guid'], suffixes=('', '_currency'))

apikey = 'YOUR_ALPHAVANTAGE_API'  # put your AlphaVantage API key here
function = 'TIME_SERIES_DAILY'
rooturl = 'https://www.alphavantage.co/query?'

# Guess CURRENCY_GUID from the last price already in the table
if not CURRENCY_GUID:
    CURRENCY_GUID = prices.iloc[-1]['currency_guid']

# GET ALL THE TICKERS IN GNUCASH DB BY MNEMONIC NAME.
tickers = com[com['namespace'] != 'CURRENCY']['mnemonic']

historical = {}
# FOR EACH TICKER, RETRIEVE INFO ABOUT IT
for ticker in tickers:
    qstr = 'function={0}&symbol={1}&apikey={2}'.format(function, ticker, apikey)
    historical[ticker] = requests.get(rooturl + qstr)


def get_ticker_guid(tic, df=com):
    '''Given a ticker and the commodities table, retrieve the guid of the ticker.
    Return -> ticker_guid (str)'''
    res = df[df['mnemonic'] == tic]
    if res.shape[0] != 1:
        raise Exception('Length error, could not find ticker guid')
    return res.iloc[0]['guid']


def parse_response_to_prices(response):
    '''Parse an AlphaVantage response into a row for the prices table.
    Return -> metadata_dict, prices_dict_to_append'''
    prices_dict_to_append = {}
    prices_dict_to_append['guid'] = uuid.uuid4().hex
    prices_dict_to_append['currency_guid'] = CURRENCY_GUID
    prices_dict_to_append['source'] = 'PVpyScript'
    b = json.loads(response.content.decode())
    last_refreshed = b['Meta Data']['3. Last Refreshed']
    tickguid = get_ticker_guid(b['Meta Data']['2. Symbol'], com)
    prices_dict_to_append['type'] = 'close'
    prices_dict_to_append['commodity_guid'] = tickguid
    # GnuCash stores prices as a fraction: value_num / value_denom
    last_price = float(b['Time Series (Daily)'][last_refreshed]['4. close'])
    prices_dict_to_append['value_num'] = int(last_price * 10**4)
    prices_dict_to_append['value_denom'] = 10**4
    date = last_refreshed.replace('-', '') + '180800'
    prices_dict_to_append['date'] = date
    # print(b['Meta Data'])
    # print(prices_dict_to_append)
    return b['Meta Data'], prices_dict_to_append


# PROCESS EACH TICKER PRICE AND APPEND TO PRICES TABLE
new_prices = prices.iloc[0:0].copy()
for k, v in historical.items():
    print('Processing %s' % k)
    try:
        meta, d = parse_response_to_prices(v)
    except Exception as e:
        print('Could not process this ticker, because of the following exception:')
        print(e)
        # print(v.content)
        continue
    new_prices = pd.concat([new_prices, pd.DataFrame([d])], ignore_index=True)

# APPEND NEW PRICES TO GNUCASH DATABASE
new_prices.to_sql('prices', engine, if_exists="append", index=False)
Awesome script! I created an updated version at https://gist.github.com/rgreen13/643449c327e66cf749a519490a689361 to handle the AlphaVantage API limit.
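For anyone who just wants to stay under the rate limit without switching gists, a minimal sketch is to sleep between requests in the fetch loop above (assuming the free tier's limit of roughly 5 requests per minute; adjust to your plan). It is a drop-in replacement for the fetch loop in the script:

import time

REQUESTS_PER_MINUTE = 5  # assumption: AlphaVantage free-tier limit; adjust to your plan

for ticker in tickers:
    qstr = 'function={0}&symbol={1}&apikey={2}'.format(function, ticker, apikey)
    historical[ticker] = requests.get(rooturl + qstr)
    time.sleep(60.0 / REQUESTS_PER_MINUTE)  # pause so the loop never exceeds the quota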
Thank you for the script. As AlphaVantage does not work for Indian stock exchanges, I created a fork that uses another API here: https://gist.github.com/thomasjimmy/3234ee5377693ae926f7e42c8e0038ea
It works for stocks in India.
Thank you for this. It seems to work nicely, but I wonder if lock checking can be added.
I ran a test: while I had the file open in GnuCash, I ran the update script. Then I closed the file and opened it again, and the file seems corrupted.
Can the script check whether the file is already open and abort if so? I just converted from XML to sqlite3 so I could use this script, and I no longer see the .LCK files I used to see when the XML file was open. So I guess I'd better ask those who know better.
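Not a definitive answer, but one possible safeguard: as far as I know, GnuCash's SQL backend records an open session in a gnclock table inside the database, so the script could refuse to write while that table has rows. A minimal sketch, to be verified against your own file before relying on it:

from sqlalchemy import create_engine, inspect, text

engine = create_engine('sqlite:///itau.gnucash')  # same file the script uses

# Assumption: GnuCash keeps a row in 'gnclock' while the book is open in the GUI.
if inspect(engine).has_table('gnclock'):
    with engine.connect() as conn:
        open_sessions = conn.execute(text('SELECT COUNT(*) FROM gnclock')).scalar()
    if open_sessions:
        raise SystemExit('GnuCash seems to have this file open (gnclock is not empty); aborting.')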