Tired of running the same crawl over and over again? We automatically schedule your crawls at given intervals and handle the monitoring to make sure that everything still works.
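The idea of interval scheduling with monitoring can be sketched as follows. This is purely illustrative: the function names, the staleness rule, and the two-interval grace period are assumptions, not molescrape's actual server-side implementation.

```python
# Hypothetical sketch: interval scheduling plus a staleness check
# (illustrative only; not molescrape's actual scheduler).
from datetime import datetime, timedelta

def is_due(last_run, interval, now):
    """A crawl is due once the interval since the last run has elapsed."""
    return now - last_run >= interval

def is_stale(last_success, interval, now, grace=2):
    """Monitoring: flag a crawl whose last success is older than
    `grace` intervals, i.e. something has probably broken."""
    return now - last_success > grace * interval

now = datetime(2024, 1, 3, 12, 0)
interval = timedelta(days=1)
print(is_due(datetime(2024, 1, 2, 11, 0), interval, now))   # True: over a day ago
print(is_stale(datetime(2024, 1, 1, 0, 0), interval, now))  # True: no success for 2.5 days
```

A real scheduler would of course persist these timestamps and alert on staleness rather than just print it.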
You can run custom post-processing pipelines on every collected item. This allows for instant e-mail notifications on critical events or data ingestion into databases.
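A per-item pipeline of this kind could look like the sketch below. The field names, the severity threshold, and the in-memory sinks standing in for e-mail and a database are all assumptions for illustration, not molescrape's actual API.

```python
# Hypothetical post-processing pipeline run once per collected item.
def process_item(item, notify, store):
    """Flag critical events for notification, then store every item."""
    if item.get("severity", 0) >= 8:  # assumed "critical" threshold
        notify(f"Critical event: {item['title']}")
    store(item)

# In-memory stand-ins for an e-mail sender and a database writer:
notifications, database = [], []
items = [
    {"title": "Routine update", "severity": 2},
    {"title": "Plant shutdown", "severity": 9},
]
for item in items:
    process_item(item, notifications.append, database.append)

print(notifications)  # ['Critical event: Plant shutdown']
print(len(database))  # 2
```

In production, `notify` would wrap an e-mail client and `store` a database insert; the pipeline shape stays the same.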
Data is worthless without insights. That is why we let you analyze your collected data, either with Jupyter Notebooks or with your own custom system.
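A minimal example of the kind of analysis a notebook might run, assuming collected items are exported as JSON lines (the export format and field names here are assumptions, not a documented molescrape interface):

```python
# Hypothetical analysis sketch: count collected items per source.
import json
from collections import Counter

raw = """\
{"source": "agency_a", "value": 0.12}
{"source": "agency_b", "value": 0.30}
{"source": "agency_a", "value": 0.15}
"""

items = [json.loads(line) for line in raw.splitlines()]
per_source = Counter(item["source"] for item in items)
print(per_source)  # Counter({'agency_a': 2, 'agency_b': 1})
```

The same data could just as well be loaded into pandas or shipped to an external analytics system.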
We constantly monitor the European newspaper ecosystem. This allows us to compare European countries with one another and to detect and react to events.
We performed a price analysis of the vehicle market to calculate the actual depreciation of different car models.
With molescrape, we were also able to develop a custom search engine for a blog that stays up to date automatically. The blog itself did not require any code changes at all.
After the Fukushima incident in Japan in 2011, the founder of molescrape collected the publicly available radiation data from different government agencies and combined it into charts.