
throttle() for async in Python 3.5 #297

Closed
tych777 opened this issue Oct 9, 2017 · 6 comments

tych777 commented Oct 9, 2017

Again, I'd like to start by saying 'Thanks!' to all contributors of this useful project.
I noticed that throttle() for async is experimental. When do you plan to implement it?
Can I just add the following to the def of fetch in async/exchange.py?

if self.enableRateLimit:
    self.throttle()
self.lastRestRequestTimestamp = self.milliseconds()

Will it work? What consequences should I expect? What workarounds can you suggest to make throttle() work for async?

@kroitor kroitor self-assigned this Oct 9, 2017

kroitor commented Oct 9, 2017

Hi, @tych777 ! Good to hear again from you!

When do you plan to implement it?

Hopefully soon, in less than a week, I think.

Can I just add the following to the def of fetch in async/exchange.py? Will it work? What consequences should I expect?

Not really; it would break the asynchronous nature of asyncio by halting all concurrent execution. Instead, when waiting inside an asyncio coroutine, we should use the asyncio flavour of waiting (an awaitable sleep), so that other concurrent tasks can keep running. In other words, in async we have to do it differently.
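Roughly, something like this (just a sketch, not the actual library code; the class and attribute names only mirror the snippet you quoted):

import asyncio
import time


class Exchange:

    enableRateLimit = True
    rateLimit = 2000  # assumed delay between requests, in milliseconds

    def __init__(self):
        self.lastRestRequestTimestamp = 0

    def milliseconds(self):
        return int(time.time() * 1000)

    async def throttle(self):
        # await asyncio.sleep() yields control to the event loop,
        # so other tasks keep running while this coroutine waits
        elapsed = self.milliseconds() - self.lastRestRequestTimestamp
        if elapsed < self.rateLimit:
            await asyncio.sleep((self.rateLimit - elapsed) / 1000)

    async def fetch(self, url):
        if self.enableRateLimit:
            await self.throttle()
        self.lastRestRequestTimestamp = self.milliseconds()
        # ... the actual HTTP request would go here ...
        return url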

What workarounds can you suggest to make throttle() work for async?

We should design it with an asyncio.Queue for requests. Each fetch call should place a request into the queue, and there should be an async loop running somewhere that executes them one by one while staying below the rate limit. If you're familiar with JS, take a look at how the poller loop is done there. If not, then maybe the Python asyncio docs can help... Does this answer your question?
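To make the idea concrete, here is a rough sketch of such a queue with a poller loop (RateLimitedQueue, worker and submit are made-up names, not part of ccxt):

import asyncio


class RateLimitedQueue:

    def __init__(self, rate_limit_ms):
        self.rate_limit = rate_limit_ms / 1000
        self.queue = asyncio.Queue()

    async def worker(self):
        # the poller loop: execute queued requests one by one,
        # sleeping between them to stay below the exchange rate limit
        while True:
            coro, future = await self.queue.get()
            try:
                future.set_result(await coro)
            except Exception as error:
                future.set_exception(error)
            await asyncio.sleep(self.rate_limit)

    async def submit(self, coro):
        # each fetch call places a request into the queue and awaits its result
        future = asyncio.Future()
        await self.queue.put((coro, future))
        return await future

The worker would be started once per exchange instance, e.g. with asyncio.ensure_future(queue.worker()), and every API call would then go through submit() instead of being awaited directly.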

kroitor commented Oct 9, 2017

You can also take a look at how the async generator examples work with an asyncio queue here: https://github.com/ccxt-dev/ccxt/tree/master/examples/py

@kroitor kroitor changed the title throttle() for async throttle() for async in Python Oct 9, 2017

tych777 commented Oct 10, 2017

@kroitor Thank you for your quick reply and for pointing to the already existing examples. I think throttle() is even more relevant for async than for synchronous code. It'd be nice to have a throttle per instance of a particular exchange, not just per function. It'd also be very useful to be able to prioritize and discard specific calls. For example, I'd like to poll tickers as frequently as possible and analyse them constantly; during that analysis I occasionally need to make trades and fetch balances. I'd like to give priorities to specific functions in the queue (fetch_balances = 1: execute fetch_balances calls first, in their order of arrival; make_trade = 2: execute make_trade calls in their order of arrival once the fetch_balances calls are exhausted; etc.) and discard stale calls of other functions (fetch_tickers('ETH/BTC'): if more than one is in the queue, discard all except the latest one). A rough sketch of what I mean follows below.
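Purely hypothetical, none of these names exist in ccxt; just to illustrate the prioritization and discarding I have in mind:

import asyncio
import itertools

# priority per function name: lower number means served first (illustrative values)
PRIORITY = {'fetch_balances': 1, 'make_trade': 2, 'fetch_tickers': 3}

counter = itertools.count()        # FIFO tie-breaker within the same priority
queue = asyncio.PriorityQueue()
pending_tickers = {}               # symbol -> latest pending fetch_tickers entry


async def enqueue(name, coro, symbol=None):
    entry = [PRIORITY[name], next(counter), coro]
    if name == 'fetch_tickers' and symbol is not None:
        stale = pending_tickers.get(symbol)
        if stale is not None and stale[2] is not None:
            stale[2].close()       # discard the older pending call for this symbol
            stale[2] = None
        pending_tickers[symbol] = entry
    await queue.put(entry)


async def worker(rate_limit_ms):
    while True:
        priority, order, coro = await queue.get()
        if coro is None:           # skip entries that were superseded
            continue
        await coro
        await asyncio.sleep(rate_limit_ms / 1000)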

kroitor commented Oct 10, 2017

I think throttle() is even more relevant for async than for synchronous code.

I totally agree. We're rushing it as fast as we can.

It'd be nice to have a throttle per instance of a particular exchange, not just per function.

Yep, the examples just show how I would implement that inside the library for all functions in general. We still need the same concepts, like an asyncio queue, to do it in the lib.

I'd like to give priorities to specific functions in the queue (fetch_balances = 1: execute fetch_balances calls first, in their order of arrival; make_trade = 2: execute make_trade calls in their order of arrival once the fetch_balances calls are exhausted; etc.) and discard stale calls of other functions (fetch_tickers('ETH/BTC'): if more than one is in the queue, discard all except the latest one).

We all need the same thing; the caveat here is that the exchanges themselves have varying rate limits depending on which calls you make, so it isn't really easy or straightforward to manage. But we'll do it anyway.

Thx for your feedback!

kroitor commented Nov 2, 2017

Ok, this issue was resolved in version 1.9.336. Thx for your patience!
Usage examples were added as well:

It is currently not configured for all exchanges, and it mimics the behaviour of the old rate limiter algorithm, which just makes uniformly distributed requests, once per rateLimit milliseconds. But if configured properly (controlled with the tokenBucket exchange property), the new throttle algorithm can handle bursts and variable call costs while preserving the order of calls. This functionality will soon be thoroughly documented in the Manual. The rate limiter options will also be configured for all exchanges for best performance.
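To illustrate the token bucket idea (a simplified sketch only; the parameter names here are not necessarily the ones used by the tokenBucket property):

import asyncio
import time


class TokenBucket:

    def __init__(self, capacity=16, refill_rate=2.0):
        self.capacity = capacity        # maximum burst size, in tokens
        self.refill_rate = refill_rate  # tokens added back per second
        self.tokens = capacity
        self.timestamp = time.monotonic()

    async def throttle(self, cost=1):
        while True:
            now = time.monotonic()
            # refill proportionally to the elapsed time, capped at capacity
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.timestamp) * self.refill_rate)
            self.timestamp = now
            if self.tokens >= cost:
                self.tokens -= cost     # heavier calls can consume more tokens
                return
            # not enough tokens yet: sleep just long enough to cover the deficit
            await asyncio.sleep((cost - self.tokens) / self.refill_rate)

Because the bucket starts full, a burst of cheap calls goes through immediately, and the throttle only begins delaying once the tokens are spent.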

In the meantime, since we now have it working in asyncio, I am closing this issue. Feel free to ask if you have any questions or difficulties with it. Thx again for your feedback! Cheers!

@kroitor kroitor closed this as completed Nov 2, 2017

kroitor commented Nov 2, 2017

P.S. The current implementation is here; your contributions are welcome!
