Cognism collects its data from premium data providers, including CrunchBase, Financial Times, S&P Global and hundreds of other premium sources. We also extract data from permitted public sources, including SEC filings and news articles, using proprietary Natural Language Processing technology. Further, we operate a data-verification system called DataHelix: users can submit entries they believe are outdated, these entries are then checked by DataHelix's AI, and any entries deemed suspect may receive a final manual review.
Cognism does not scrape websites against their robots.txt policy. Please e-mail our policy team directly (email@example.com) if you believe a Cognism server has acted against your server's policy, and we will investigate the matter promptly. A Cognism service may check your robots.txt file to verify your current policy.
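To illustrate what a robots.txt check involves, here is a minimal sketch using Python's standard `urllib.robotparser` module. This is a generic example of how any crawler can test a policy, not a description of Cognism's actual implementation; the user-agent name, policy text and paths are all hypothetical.

```python
# Illustrative sketch: checking whether a robots.txt policy allows a fetch.
# Not Cognism's actual code; all names below are made-up examples.
from urllib.robotparser import RobotFileParser


def is_fetch_allowed(robots_txt: str, user_agent: str, path: str) -> bool:
    """Parse a robots.txt body and report whether user_agent may fetch path."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)


# Hypothetical policy that blocks /private/ for all user agents.
policy = """User-agent: *
Disallow: /private/
"""

print(is_fetch_allowed(policy, "ExampleBot", "/public/page"))   # True
print(is_fetch_allowed(policy, "ExampleBot", "/private/data"))  # False
```

A well-behaved crawler re-fetches robots.txt periodically, since site owners can change their policy at any time.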
Cognism has had its services verified by leading international law firms; our policy team can be contacted for the relevant legal notes.