Google Launches BigQuery Streaming For Real-Time, Big-Data Analytics
BigQuery, Google’s cloud-based tool for quickly analyzing very large datasets, is getting a massive price cut today (up to 85 percent). But Google is also adding an important new feature that makes it more competitive with big-data services like Amazon Kinesis. Starting soon, developers will be able to stream up to 100,000 rows of data per second into BigQuery and analyze them in near real time.
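For a sense of what the streaming path looks like in practice, here is a minimal sketch using the google-cloud-bigquery Python client; the project, table name, and row fields are placeholders, not details from the announcement.

```python
# Minimal sketch: streaming rows into BigQuery with the Python client.
# The table ID and row fields below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.metering.readings"  # hypothetical table

rows = [
    {"meter_id": "A-1001", "kwh": 0.42, "reading_ts": "2014-03-25T12:00:00Z"},
    {"meter_id": "A-1002", "kwh": 0.00, "reading_ts": "2014-03-25T12:00:00Z"},
]

# insert_rows_json sends the rows over the streaming API, so they become
# queryable within seconds instead of waiting for a batch load job.
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Streaming insert failed:", errors)
```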
This makes BigQuery an option for a whole new range of services that rely on real-time analytics. At its Cloud Platform event today, Google will show how a large public utility could use the system to analyze energy-usage data from its customers and spot potential outages within minutes, for example. Using BigQuery, the utility can run a query that looks for electrical meters within a few miles of a given location that haven't reported any electricity use in the last five minutes.
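A rough sketch of what such an outage query might look like follows; the tables, columns, and the crude bounding-box stand-in for "within a few miles" are all assumptions for illustration, not Google's actual demo.

```python
# Sketch of an outage-style query: meters near a point with no reading
# in the last five minutes. Dataset, columns, and the bounding-box
# distance filter are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT m.meter_id
FROM `my-project.metering.meters` AS m
LEFT JOIN `my-project.metering.readings` AS r
  ON r.meter_id = m.meter_id
 AND r.reading_ts > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 5 MINUTE)
WHERE ABS(m.lat - @lat) < 0.05   -- roughly "a few miles" in latitude
  AND ABS(m.lng - @lng) < 0.05
GROUP BY m.meter_id
HAVING COUNT(r.meter_id) = 0     -- no readings streamed in recently
"""

job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("lat", "FLOAT64", 37.77),
            bigquery.ScalarQueryParameter("lng", "FLOAT64", -122.42),
        ]
    ),
)
for row in job:
    print("Possible outage at meter", row.meter_id)
```

Because the readings arrive through the streaming API, a query like this can be re-run every few minutes against data that is only seconds old.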
Other use cases include marketing and financial data, logs, and other kinds of metering data.
In the world of big-data analytics, Google’s service is also relatively affordable. On-demand queries will cost $5 per terabyte, and 5 GB/sec of reserved query capacity will set you back “only” $20,000 per month. Google argues that’s 75 percent lower than what other providers charge.
Until now, BigQuery offered only a very limited streaming capability that could ingest up to 1,000 rows per second per table. That’s enough for some applications, but not for the kind of workloads a competitor like Amazon Kinesis was designed to handle.