Rate limiting
Hey,
In order to keep the API scalable, we are soon going to implement rate limiting on it. This means that you will be allowed to make only a certain number of requests per hour (the exact value is yet to be defined, but it should be around 200-300 requests per hour). If you reach the limit, you will receive a 400 Bad Request error, and you'll have to wait until the beginning of the next hour to make requests again.
In the future we will handle special requests so that a popular application (e.g. sites like tm-ladder.com) can get a higher rate limit.
Rate limiting is a fairly standard feature on public web service APIs. You can check the Twitter page for tips on how to avoid being rate limited: http://dev.twitter.com/pages/rate-limiting#tips
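For illustration, here is a minimal sketch of how a client application could count its own requests locally and stop before hitting the server-side limit. It is not part of any official SDK; the 250-per-hour value and the counter file path are assumptions, since the real limit is not defined yet.
Code:
<?php
// Minimal client-side guard: keep a per-hour request counter in a local
// file and stop calling the API once the assumed limit is reached.
define('HOURLY_LIMIT', 250);                              // assumed value, not final
define('COUNTER_FILE', '/tmp/api_request_counter.json');  // assumed path

function canMakeRequest()
{
    $hour  = floor(time() / 3600);                        // current hour bucket
    $state = array('hour' => $hour, 'count' => 0);

    if (is_readable(COUNTER_FILE)) {
        $saved = json_decode(file_get_contents(COUNTER_FILE), true);
        if (is_array($saved) && $saved['hour'] == $hour) {
            $state = $saved;                              // still in the same hour
        }
    }

    if ($state['count'] >= HOURLY_LIMIT) {
        return false;                                     // wait for the next hour
    }

    $state['count']++;
    file_put_contents(COUNTER_FILE, json_encode($state));
    return true;
}

if (canMakeRequest()) {
    // ... perform the API call here ...
} else {
    // Back off until the next hour starts.
}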
Re: Rate limiting
Interesting, nice move. You seem to like specifications set by social networks.
Re: Rate limiting
It's not so much that I like them, it's more that those are good practices when it comes to web services. And they've been set by social networks, but also by companies like Google or Amazon (with AWS).
- destroflyer
Re: Rate limiting
Hm... maybe it would be better to cache some of the requests instead of forbidding or denying them after a certain amount?
- Knutselmaaster
Re: Rate limiting
Like with the "old" datafetcher, you can always cache locally, I suppose.
Re: Rate limiting
destroflyer wrote: Hm... maybe it would be better to cache some of the requests instead of forbidding or denying them after a certain amount?
Caching is already done on the server, and as Knutselmaaster pointed out, it's a good practice to cache locally in your application.
Rate limiting, however, allows us to have a more scalable infrastructure, because we can size our servers more easily.
- w1lla
Re: Rate limiting
The formula in the old TMF datafetcher was this:
So it's useful to use this for the new providers as well.
Code:
$cachetime = 86400;                          // cache lifetime: one day, in seconds
$midnight = floor(time()/86400)*86400-3600;  // last midnight as a timestamp, shifted back one hour
$secondssincemidnight = time()-$midnight;    // seconds elapsed since that midnight
$nextupdate = floor($secondssincemidnight / $cachetime)*$cachetime + $midnight; // boundary of the current cache window
Also, an old version of the tmndatafetcher had no cache holder, so it's better to use the code I just posted.
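As an illustration of how that boundary can be used, here is a minimal sketch that reuses a locally cached response while it is still inside the current window. The cache file name and the fetch URL are placeholders, not part of any official datafetcher.
Code:
<?php
// Recompute the boundary of the current cache window (same formula as above).
$cachetime = 86400;
$midnight  = floor(time() / 86400) * 86400 - 3600;
$secondssincemidnight = time() - $midnight;
$nextupdate = floor($secondssincemidnight / $cachetime) * $cachetime + $midnight;

$cachefile = 'ladder_cache.xml';                     // assumed local cache file

if (is_readable($cachefile) && filemtime($cachefile) >= $nextupdate) {
    // The cached copy was written inside the current window: reuse it.
    $data = file_get_contents($cachefile);
} else {
    // The window has rolled over: fetch fresh data and refresh the cache.
    $data = file_get_contents('http://example.com/api/ladder'); // placeholder URL
    file_put_contents($cachefile, $data);
}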
Re: Rate limiting
Rate limiting is now effective on the API. Whenever you make a request to the API, there's an X-Rate-Limit header containing an x-www-form-urlencoded string (just like a URI query string, so you can decode it with PHP's parse_str function: http://www.php.net/manual/en/function.parse-str.php).
Here's what it looks like:
Code:
X-Rate-Limit: current=6&limit=2000&expires_in=3122
When the rate limit is reached, you get a "400 Bad Request" error, and the Retry-After header is included in the response:
Code:
Retry-After: 2888
In the next update of the SDK I'll provide methods to easily access those values.
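Below is a minimal sketch of reading those two headers in PHP with cURL and parse_str. The request URL is a placeholder; only the header names and their formats come from the post above.
Code:
<?php
// Fetch a resource and keep the response headers so we can inspect them.
$ch = curl_init('http://example.com/api/resource');  // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);              // include headers in the output
$response   = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
$status     = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

$rawHeaders = substr($response, 0, $headerSize);

// Decode the X-Rate-Limit payload (an x-www-form-urlencoded string).
if (preg_match('/^X-Rate-Limit:\s*(.+)$/mi', $rawHeaders, $m)) {
    parse_str(trim($m[1]), $rateLimit);
    // $rateLimit now contains 'current', 'limit' and 'expires_in'.
    echo $rateLimit['current'] . '/' . $rateLimit['limit'] . "\n";
}

// When the limit is reached (400 Bad Request), wait for Retry-After seconds.
if ($status == 400 && preg_match('/^Retry-After:\s*(\d+)/mi', $rawHeaders, $m)) {
    $retryAfter = (int) $m[1];
    // sleep($retryAfter); or schedule the next call accordingly.
}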