Bitcoin Daily Volume is $0.5T, Can't Be True

The daily volume for Bitcoin has passed the half-trillion-dollar mark. This can't be true, can it?
dump.json (1.5 KB)

Hi, thanks for reporting the issue. It's definitely a bug; we are aware of it and working on a fix.

Hi, after checking with the dev team, this was an isolated incident affecting BTC.

Originally, I thought it was the same as this incident: Something strange happens to ETH daily volume at midnight GMT

However, we determined they were separate issues.

Thanks again for reporting.


Thanks for following up. There is one more issue with the "/currencies/ticker" API endpoint: it sometimes inserts a line break character ("\n") at a random location, which breaks the JSON when it is parsed. I worked around it by removing all line break and carriage return characters before parsing.
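For reference, the workaround is just stripping those characters before handing the string to json_decode, roughly like this (a rough sketch, with "mykey" as a placeholder for my real key):

// Strip stray CR/LF characters from the response body before parsing.
$raw   = file_get_contents('https://api.nomics.com/v1/currencies/ticker?key=mykey');
$clean = str_replace(["\r", "\n"], '', $raw);
$data  = json_decode($clean, true);
if ($data === null) {
    // The payload still failed to parse even after stripping line breaks.
    error_log('JSON decode failed: ' . json_last_error_msg());
}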

Thanks, we'll check it out. Are there any particular query params that consistently cause this?

I'm afraid I can't provide a supporting file to back this up, but it was happening randomly. When the line break landed after the decimal point and before the decimal digits, it broke the JSON parser, which is how I discovered the issue. I don't have the JSON dumps because I was deleting them periodically. However, I can run my script without stripping the line break characters to catch one for you.

Oh, it was within a numeric field? That’s very odd. We haven’t seen this in our own usage, so any reproduction or snapshot would be extremely helpful.

dump.json.json (772.3 KB) Check position 1531153. I compressed the file and changed the extension so it would pass the upload filter. Please rename the extension back to .zip.

Please note that it happens randomly, so I had to retrieve and parse many times (spaced 10 seconds apart) before hitting the problem.

I used the following PHP code to retrieve and parse:
$url = 'https://api.nomics.com/v1/currencies/ticker?key=mykey&interval=1h,1d,7d,30d,365d';
$file = file_get_contents($url);                              // fetch the raw response body
$json = json_decode($file, true, 512, JSON_OBJECT_AS_ARRAY);  // parse into an associative array
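To catch a bad payload, I just run that fetch in a loop, roughly like the sketch below (the sleep interval and output file name are arbitrary, and "mykey" is a placeholder):

// Fetch every 10 seconds without stripping line breaks,
// and keep the first response that fails to parse.
$url = 'https://api.nomics.com/v1/currencies/ticker?key=mykey&interval=1h,1d,7d,30d,365d';
while (true) {
    $file = file_get_contents($url);
    $json = json_decode($file, true, 512, JSON_OBJECT_AS_ARRAY);
    if ($json === null) {
        file_put_contents('dump.json', $file);  // save the broken payload for inspection
        break;
    }
    sleep(10);
}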

Thanks for the extra info. We’re going to write a similar script and investigate.

Hello,

We're pretty sure the issue is caused by HTTP chunked transfer encoding, which sends a CRLF between chunks. This is a known bug with file_get_contents:

https://bugs.php.net/bug.php?id=47759

I'm not familiar with PHP, so I can't give you an exact solution, but I would recommend switching to a fetch method that handles HTTP chunked encoding correctly. Let me know if that works.

Thanks for following up. I had the same suspicion, so I switched to cURL, and the issue no longer occurs.
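In case anyone else runs into this, the cURL version I switched to looks roughly like this (a sketch, with "mykey" as a placeholder):

// cURL decodes chunked transfer encoding itself, so no stray CRLFs end up in the body.
$url = 'https://api.nomics.com/v1/currencies/ticker?key=mykey&interval=1h,1d,7d,30d,365d';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
$file = curl_exec($ch);
curl_close($ch);
$json = json_decode($file, true, 512, JSON_OBJECT_AS_ARRAY);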