Dukascopy - WARNING or timeout causes missing data #43
Yes, you should probably remove the line changing the market thread technique. Downloading from Dukascopy seems to work OK from my machine (the main problem arises if you increase the number of threads in the configuration, which can cause server-side rejection; in the newest version of the code I've reduced the number of threads for this reason). You can also try downloading a day at a time and sleeping between calls.
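The day-at-a-time approach the maintainer suggests could be sketched as below. The `daily_ranges` helper is illustrative, and the findatapy usage in the trailing comments is an assumption about the package's `Market.fetch_market` / `MarketDataRequest` API, not code taken from this issue:

```python
import time
from datetime import date, timedelta

def daily_ranges(start: date, finish: date):
    """Yield one (day_start, day_end) pair per calendar day in [start, finish]."""
    day = start
    while day <= finish:
        yield day, day
        day += timedelta(days=1)

# Hypothetical usage with findatapy (API assumed, adjust to your setup):
# for d0, d1 in daily_ranges(date(2012, 10, 1), date(2012, 10, 7)):
#     md_request.start_date, md_request.finish_date = str(d0), str(d1)
#     df = market.fetch_market(md_request)
#     time.sleep(5)  # pause between calls to avoid server-side rejection
```

Fetching a single day per request keeps each call small, and the sleep spaces the requests out so the server is less likely to reject or time out the connection.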
Thank you for your answer. I tried without using the "thread" or "multiprocess" option. Unfortunately, the problem persisted (at least on my computers): data still goes missing when the download is retried (I don't know why). If I have already downloaded the data from Dukascopy, is there a way in findatapy to just unpack the .bi5 files for a ticker into a CSV file, say as 1 min candles? That would be a really great feature! :)
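For reference, the hourly .bi5 chunks are LZMA-compressed streams of fixed 20-byte big-endian records: millisecond offset within the hour, ask and bid prices in points, then ask and bid volumes as floats. A minimal stdlib-only decoder might look like the sketch below; this follows the commonly documented community description of the format rather than findatapy's own reader, and the point size (1e-5 for most pairs, 1e-3 for JPY crosses such as GBPJPY) is an assumption you should verify against your data:

```python
import lzma
import struct
from datetime import datetime, timedelta

# One tick record: uint32 ms offset, uint32 ask, uint32 bid, float ask vol, float bid vol
RECORD = struct.Struct(">IIIff")

def parse_bi5(raw: bytes, hour_start: datetime, point: float = 1e-5):
    """Decode one hourly .bi5 chunk into (timestamp, ask, bid, ask_vol, bid_vol) tuples."""
    body = lzma.decompress(raw) if raw else b""
    ticks = []
    for ms, ask, bid, ask_vol, bid_vol in RECORD.iter_unpack(body):
        ticks.append((hour_start + timedelta(milliseconds=ms),
                      ask * point, bid * point, ask_vol, bid_vol))
    return ticks
```

From there, writing the tuples out with `csv.writer`, or loading them into pandas and resampling to 1 min bars, is straightforward.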
Hi. Thank you for this great package. However, I have run into a problem: when downloading Dukascopy forex tick data, if there is a
WARNING - Didn't download on 1 attempt
or
WARNING - Problem downloading.. https://www.dukascopy.com/datafeed/GBPJPY/2012/10/03/14h_ticks.bi5 HTTPConnectionPool(host='datafeed.dukascopy.com', port=80): Max retries exceeded with url: /datafeed/GBPJPY/2012/10/03/14h_ticks.bi5 (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x000001F49D7B09A0>, 'Connection to datafeed.dukascopy.com timed out. (connect timeout=10)')).. will try again 0 occasion
it says it will retry. But I noticed that this causes missing data (in blocks of 1 hour, probably a missing 1 h chunk file). It also retries downloading, seemingly in random order, chunk files that were (I assume, since there was no warning) already downloaded correctly.
Do you know what the culprit could be? Could it be multiprocessing in combination with Windows? (edit: I will try it with "thread" instead of "multiprocessing" and post back after the test)
The data connection should be fine; I'm in a large institution, so the firewall should not be the problem.
For your information, I'm on Windows 11 and using the following code:
Thank you very much for your help.