The Azure SQL Data Warehouse is slated to launch later this year and will give companies a way to store petabytes of data. That data can then be consumed by analytics tools such as Microsoft's Power BI for data visualization, Azure Data Factory for data orchestration, or the Azure Machine Learning service.
One thing that sets this service apart from other data warehouse offerings is that storage and compute scale independently: storage grows to fit the amount of data you actually store, and you can specify exactly how much processing power you need to analyze it. The service builds on the parallel processing architecture that Microsoft developed for its SQL Server database.
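As a rough sketch of what adjusting compute independently of storage looks like, Azure SQL Data Warehouse exposes compute as a service objective measured in data warehouse units (DWUs), which can be changed with a single T-SQL statement (the warehouse name and DWU level below are hypothetical examples):

```sql
-- Scale compute for a warehouse named MyWarehouse (hypothetical name)
-- by changing its service objective (data warehouse units).
-- Storage is unaffected by this change and is billed separately.
ALTER DATABASE MyWarehouse
MODIFY (SERVICE_OBJECTIVE = 'DW200');
```

Because compute is decoupled from storage, a warehouse can be scaled up for a heavy reporting window and back down afterward without touching the stored data.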
The company also updated the Azure SQL Database service so that customers can pool their Azure databases into a shared elastic pool, letting the databases share resources to reduce cost and absorb spikes in activity.
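To illustrate the pooling idea, an existing database can be moved into an elastic pool by setting its service objective to the pool, so it draws from the pool's shared resources instead of its own fixed allocation (the database and pool names below are hypothetical):

```sql
-- Move the database MyDb (hypothetical name) into the elastic pool
-- MyPool (hypothetical name). Once in the pool, MyDb shares the
-- pool's compute and storage allocation with the other pooled
-- databases instead of being billed for its own fixed capacity.
ALTER DATABASE MyDb
MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL (name = MyPool));
```

Databases with staggered usage peaks are the natural fit here, since one database's quiet period frees resources for another's spike.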
All of this is especially useful for running public-facing software services whose storage needs fluctuate sharply from day to day. Most comparable services charge for your peak capacity regardless of how much of it you are using at the time. Pooling instead lets you pay for something much closer to what you actually consume at any given moment, which can cut costs substantially.
Content originally published here