Geocoding Errors

Google returns one of the following status codes depending on how the request was formed and how quickly requests are being submitted. These codes come from the legacy Geocoding (V2) web service; each entry lists the numeric code, its moniker, and a description.

  • 200 G_GEO_SUCCESS: No errors occurred; the address was successfully parsed and its geocode has been returned.
  • 400 G_GEO_BAD_REQUEST: A directions request could not be successfully parsed. For example, the request may have been rejected if it contained more than the maximum number of waypoints allowed.
  • 500 G_GEO_SERVER_ERROR: A geocoding, directions or maximum zoom level request could not be successfully processed, yet the exact reason for the failure is not known.
  • 601 G_GEO_MISSING_QUERY: The HTTP q parameter was either missing or had no value. For geocoding requests, this means that an empty address was specified as input. For directions requests, this means that no query was specified in the input.
  • 602 G_GEO_UNKNOWN_ADDRESS: No corresponding geographic location could be found for the specified address. This may be due to the fact that the address is relatively new, or it may be incorrect.
  • 603 G_GEO_UNAVAILABLE_ADDRESS: The geocode for the given address or the route for the given directions query cannot be returned due to legal or contractual reasons.
  • 604 G_GEO_UNKNOWN_DIRECTIONS: The GDirections object could not compute directions between the points mentioned in the query. This is usually because there is no route available between the two points, or because we do not have data for routing in that region.
  • 610 G_GEO_BAD_KEY: The given key is either invalid or does not match the domain for which it was given.
  • 620 G_GEO_TOO_MANY_QUERIES: The given key has gone over the requests limit in the 24 hour period or has submitted too many requests in too short a period of time. If you’re sending multiple requests in parallel or in a tight loop, use a timer or pause in your code to make sure you don’t send the requests too quickly.
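
If you are parsing legacy (V2) responses in your own scripts, a minimal sketch like the one below can turn the codes above into a lookup and separate the throttling errors worth retrying from permanent failures. This is illustration only and is not part of the plugin.

    # Status codes returned by the legacy Geocoding (V2) web service,
    # copied from the table above.
    GEO_STATUS = {
        200: "G_GEO_SUCCESS",
        400: "G_GEO_BAD_REQUEST",
        500: "G_GEO_SERVER_ERROR",
        601: "G_GEO_MISSING_QUERY",
        602: "G_GEO_UNKNOWN_ADDRESS",
        603: "G_GEO_UNAVAILABLE_ADDRESS",
        604: "G_GEO_UNKNOWN_DIRECTIONS",
        610: "G_GEO_BAD_KEY",
        620: "G_GEO_TOO_MANY_QUERIES",
    }

    def is_retryable(code):
        """620 (throttling) and arguably 500 (unknown server error) may succeed
        on a later attempt; the remaining codes point to a problem with the
        request or the data, so retrying will not help."""
        return code in (620, 500)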

OVER_QUERY_LIMIT

About The Problem

Excerpts from: Google Geocoding Strategies

Quota Considerations

Server-side geocoding through the Geocoding Web Service has a quota of 2,500 requests per IP per day, so all requests in one day count against the quota. In addition, the Web Service is rate-limited, so that requests that come in too quickly result in blocking. Client-side geocoding through the browser is rate limited per map session, so the geocoding is distributed across all your users and scales with your userbase. Geocoding quotas and rate limits drive the strategies outlined in this article.
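
For a server-side batch job, the simplest way to respect the rate limit is to space requests out rather than sending them in a burst. The sketch below is a minimal illustration, not part of the plugin; the half-second pause is an assumption rather than a documented figure, and depending on the API version you use, a real API key may be required in place of the YOUR_API_KEY placeholder.

    import json
    import time
    import urllib.parse
    import urllib.request

    GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

    def geocode(address, api_key="YOUR_API_KEY", pause=0.5):
        """Geocode one address via the web service, then pause briefly so a
        batch of sequential requests is spread out instead of arriving in a
        burst. The 0.5 second pause is an assumption, not a documented limit."""
        params = {"address": address, "key": api_key}
        url = GEOCODE_URL + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as response:
            result = json.load(response)
        time.sleep(pause)
        return result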

In Google Maps API for Business, quotas are tied to client IDs, which provide much higher quotas. To learn more about Maps API for Business quotas and error handling, we recommend reviewing our article, Usage Limits for Google Maps API Web Services. If you’re still running into quota limits using the Google Maps API for Business, file a support request here: http://www.google.com/enterprise/portal/.

When to Use Server-Side Geocoding

You should be wary of relying on server-side geocoding. The 2,500 request limit is per IP address, and every user of your application shares your single server's quota. If you process requests that come in from a large number of clients, you can easily exhaust your daily quota, or even exceed the per-second limit for queries from a single IP address. Also, many cloud computing infrastructures, such as Google App Engine or Amazon Web Services, share IP addresses between different applications, so your requests may run up against quota used by other applications wholly outside your control.

Usage limits exceeded

If you exceed the usage limits you will get one of the following status code responses:

  • 620 for the legacy Geocoding V2 web service.
  • OVER_QUERY_LIMIT for all other web services.

This means that the web service will stop providing normal responses and switch to returning only status code OVER_QUERY_LIMIT until more usage is allowed again. This can happen:

  • Within a few seconds, if the error was received because your application sent too many requests per second.
  • Some time in the next 24 hours, if the error was received because your application sent too many requests per day. The time of day at which the daily quota for a service is reset varies between customers and for each API, and can change over time.

Upon receiving a response with status code OVER_QUERY_LIMIT, your application should determine which usage limit has been exceeded. This can be done by pausing for 2 seconds and resending the same request. If the status code is still OVER_QUERY_LIMIT, your application is sending too many requests per day. Otherwise, your application is sending too many requests per second.
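
A minimal sketch of that check is shown below; send_request() is a hypothetical helper that re-issues the original web service call and returns the status string.

    import time

    def diagnose_over_query_limit(send_request):
        """Pause 2 seconds, resend the same request once, and see whether
        OVER_QUERY_LIMIT clears. send_request() is a hypothetical helper that
        re-issues the original request and returns the status string."""
        time.sleep(2)
        if send_request() == "OVER_QUERY_LIMIT":
            return "daily quota exceeded"      # too many requests per day
        return "rate limit exceeded"           # too many requests per second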

Shared Servers

If you are on a shared hosting service, it is very possible that you will run into the 2,500 daily request limit that Google allows for a single server. The easiest remedy is to rent a dedicated server with a unique IP address. If this is not a viable option, you will need a Google Maps API for Business account.

Resolution

Using The Pro Pack Bulk Import

As of version 3.8.14 of the Pro Pack, the CSV files can include latitude/longitude fields. If these fields are set, the geocoding step is skipped. If you don’t have data with this info, you may have better luck looking up the latitude/longitude for your addresses manually on a third-party service. To avoid overloading any specific server I will leave it to you to use Google or Bing or whatever to find a “get my lat/long” website.
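
For illustration only, a CSV row with the coordinates pre-filled might look like the sketch below. The column headings here are placeholders, not the plugin's actual field names, so match them to the import documentation for your Pro Pack version.

    Name,Address,City,State,Zip,Latitude,Longitude
    "Main Office","123 Any Street","Charleston","SC","29401",32.7765,-79.9311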

Setting Retries

The easiest way to address this is to increase your Retries setting on the General Settings page in the admin panel. Our plugin will automatically add a delay before retrying a failed request, and each retry has a successively longer delay. For example:

  • Initial request fails: wait 1 second, then retry.
  • 2nd request fails: wait 2 seconds, then retry.
  • 3rd request fails: wait 3 seconds, then retry.
Thus, if you set the retries to 10 you will very likely get all valid addresses geocoded. However, processing a long list can take quite some time: you could end up waiting 55 seconds (1+2+3+...+10) per address if you hit the throttle on the Google servers.
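
For readers who want to mirror this behavior in their own batch scripts, here is a rough sketch of the same back-off pattern. The plugin already does this for you, and geocode_once() is a hypothetical helper that returns None when a request is throttled.

    import time

    def geocode_with_retries(geocode_once, address, retries=10):
        """Retry a failed geocode with successively longer delays: wait 1
        second before the first retry, 2 before the second, and so on. With
        retries=10 the worst case adds 1+2+...+10 = 55 seconds per address."""
        result = geocode_once(address)
        for attempt in range(1, retries + 1):
            if result is not None:
                return result
            time.sleep(attempt)        # successively longer delay
            result = geocode_once(address)
        return result                  # may still be None after all retries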

Google Maps API for Business

Google Maps API for Business is a professional service offered by Google for enterprises that rely on data mapping services. If you are running into OVER_QUERY_LIMIT problems, this may be your only option unless you rent hosting space on a dedicated server with a unique IP address. You can learn more about the Google Maps API for Business here:
https://developers.google.com/maps/documentation/business/

The Enterprise API from Google is an option, but it is costly and will not be viable for most plugin users. A 1-year pre-paid annual license from Google is about 350 times more expensive than the Pro Pack. Your least costly alternative is to get your site on a dedicated server with a static (and thus unique to your site) IP address. At $300/month (or less), this will be a less expensive option. Most sites will not need to run more than 2,500 geocode requests/day or serve 25,000 map views/day. If you do, you need Google’s Business Class API and you can probably afford the license.

If anyone MUST HAVE the Business Class API, please let me know and I will develop/launch the Store Locator Plus : Enterprise Pack to allow for that option.

Google Maps OEM License

Charleston Software Associates has negotiated a contract with Google to become the first Enterprise License store locator system for WordPress.  

Work will begin later this summer on new add-on packs that allow all Store Locator Plus users to purchase enterprise-class location encoding services at a steeply discounted rate. I plan to offer a “pay for what you need” service that will be far less costly than the tens of thousands of dollars Google charges for a Business Class API license. This will help sites that load more than 2,500 locations at a time, as well as sites running on a heavily loaded shared hosting environment that can only encode 100-200 locations per day.