Stress test for your "Big Data" database in the cloud (Azure Cosmos DB)


Will your Azure Cosmos DB withstand a peak of many simultaneous requests?


With Azure Cosmos DB, large amounts of data can be stored easily and efficiently in the cloud.

But it is difficult to find the best Request Units (RU/s) setting for the Cosmos DB.

Are 400 RU/s sufficient, or does the Cosmos DB have to be set to 5,000 RU/s or more?
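
As a rough starting point, you can estimate the required throughput from the per-operation charge. The figure of roughly 5-6 RU for a ~1 KB point write is an approximation; the exact charge for each request is reported by the SDK via `RequestCharge`. A back-of-the-envelope sketch:

```csharp
// Rough RU/s estimate. Assumption: a ~1 KB point write costs about
// 5-6 RU -- read the exact value from ItemResponse.RequestCharge
// in your own runs.
double ruPerWrite = 5.5;           // assumed average charge per insert
int targetWritesPerSecond = 1_000; // expected peak load (example value)
double requiredRu = ruPerWrite * targetWritesPerSecond;
Console.WriteLine($"Provision at least {requiredRu} RU/s");
```

A stress test then tells you whether such an estimate holds up against your real document sizes and indexing policy.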

Perform a stress test with large amounts of data

To get an idea of how many requests your Azure Cosmos DB can handle at the same time, I created a stress-test application for a customer and published it as open source. Using a CSV file containing 1.5 million records, the application performs save operations in a preset number of 20 threads (concurrent operations).
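
The concurrency pattern described above can be sketched roughly as follows. This is a simplified illustration, not the project's actual code; `SaveRecordAsync` is a hypothetical placeholder for the real Cosmos DB insert call.

```csharp
// Sketch: fan out save operations over at most 20 concurrent tasks.
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

class StressRunner
{
    // Limits the number of saves in flight to 20.
    static readonly SemaphoreSlim Gate = new SemaphoreSlim(20);

    static async Task RunAsync(IEnumerable<string> csvLines)
    {
        var tasks = new List<Task>();
        foreach (var line in csvLines)
        {
            await Gate.WaitAsync();        // block once 20 saves are in flight
            tasks.Add(Task.Run(async () =>
            {
                try { await SaveRecordAsync(line); }
                finally { Gate.Release(); }
            }));
        }
        await Task.WhenAll(tasks);         // wait for the remaining saves
    }

    // Placeholder: in the real application this would call the
    // Cosmos DB SDK, e.g. container.CreateItemAsync(...).
    static Task SaveRecordAsync(string record) => Task.Delay(1);
}
```

Keeping the parallelism fixed makes it easy to correlate the observed throttling (HTTP 429 responses) with a given RU/s setting.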


Would you also like to perform a stress test on your Azure Cosmos DB?

It is a good idea to download and install the Azure Cosmos DB emulator from Microsoft on your local developer machine. Then clone the Git repository:

There is a ZIP file in the project directory that contains a CSV file with 1.5 million data records. Extract this CSV file into the directory where your .NET Core console application runs. You can also use your own CSV file at any time.

Now connect to your Azure Cosmos DB in the source code:

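A minimal sketch of what the connection settings look like, assuming the Microsoft.Azure.Cosmos SDK (v3). The endpoint shown is the local emulator's default; the key placeholder and the database/container names (`StressTestDb`, `Records`) are hypothetical examples, not the project's actual values.

```csharp
using Microsoft.Azure.Cosmos;

// Emulator endpoint by default; replace with your Azure account's
// endpoint and primary key for a cloud test.
var client = new CosmosClient(
    accountEndpoint: "https://localhost:8081",
    authKeyOrResourceToken: "<your-primary-key>");

Database db = await client.CreateDatabaseIfNotExistsAsync("StressTestDb");
Container container = await db.CreateContainerIfNotExistsAsync(
    id: "Records",
    partitionKeyPath: "/id",
    throughput: 400); // the RU/s setting under test
```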

If you are using the local Cosmos DB emulator, you can find the settings (such as the primary key) here:

https://localhost:8081/_explorer/index.html

Once all settings are in place, run the application.

Feel free to leave your thoughts on this project.

