We don’t hand out surveys or ask you to fill out surveys for us.

We collect data from hundreds of job board domains worldwide and consolidate it into a single database. This is where TalentUp gets its data.


Collect data

Data is collected from more than 100 validated web sources.

Web sources include job boards, social networks, and publicly available surveys.

All the data gathered is consolidated into a single database. This is our data source.
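
The consolidation step described above can be sketched as follows. This is a minimal illustration, not TalentUp's actual pipeline: the source names and record fields are hypothetical, and a real system would write to a database rather than a list.

```python
# Minimal sketch of the consolidation step: each source yields job-offer
# records, and everything is merged into one collection while keeping
# track of where each record came from. All names here are illustrative.
def consolidate(sources):
    """Merge records from every source into a single database (here, a list)."""
    database = []
    for name, records in sources.items():
        for record in records:
            record["source"] = name  # remember the origin of each offer
            database.append(record)
    return database

sources = {
    "job_board_a": [{"title": "Data Analyst"}],
    "job_board_b": [{"title": "Backend Developer"}],
}
db = consolidate(sources)
```

Tagging each record with its source also makes the later cross-checking steps possible, since duplicates can only be detected once offers from different sites sit in the same collection.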


Normalize data

Data is standardized for analysis.

Information arrives in different languages, with different fields, and can be presented in different ways. In this step we transform the data into standard formats in preparation for further analysis.
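
As a rough sketch of what "standard formats" means in practice, consider an offer whose title has stray whitespace and whose salary is quoted per month. The field names and the monthly-to-yearly conversion below are illustrative assumptions, not the actual normalization rules.

```python
# Hedged sketch of field normalization: titles are trimmed and lowercased,
# and monthly salaries are converted to a yearly figure so that offers
# from different sources become comparable. Field names are assumptions.
def normalize(offer):
    out = dict(offer)  # do not mutate the raw record
    out["title"] = offer["title"].strip().lower()
    if offer.get("salary_period") == "month":
        out["salary"] = offer["salary"] * 12
        out["salary_period"] = "year"
    return out

raw = {"title": "  Software Engineer ", "salary": 3000, "salary_period": "month"}
clean = normalize(raw)
# clean["title"] == "software engineer", clean["salary"] == 36000
```

A real normalizer would also handle currencies, languages, and location spellings, but the shape is the same: every record is mapped onto one canonical schema before analysis.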


Deduplicate data

Duplicate data is removed by cross-checking sources.

Job offers can be published on multiple websites or reposted every few weeks. For this reason, we include only unique records in the database. After the data is extracted, TalentUp’s algorithm is responsible for validating it.
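
One simple way to picture the deduplication step: treat a combination of fields as the identity of an offer and keep only the first record with each identity. The key fields chosen here are an assumption for illustration, not TalentUp's actual matching logic.

```python
# Sketch: (title, company, location) is used as the identity of an offer.
# The same offer posted on two job boards, or reposted weeks later,
# collapses to a single record.
def deduplicate(offers):
    seen = set()
    unique = []
    for offer in offers:
        key = (offer["title"], offer["company"], offer["location"])
        if key not in seen:
            seen.add(key)
            unique.append(offer)
    return unique

offers = [
    {"title": "programmer", "company": "Acme", "location": "Madrid"},
    {"title": "programmer", "company": "Acme", "location": "Madrid"},
    {"title": "designer", "company": "Acme", "location": "Madrid"},
]
unique = deduplicate(offers)
```

In practice the key would be built from normalized fields, which is why normalization runs before deduplication in the pipeline.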

For example, a software developer can be listed as 'programador software', 'software engineer' or 'programmer'. All of these refer to the same role. To solve this, we use an advanced model that identifies which role a job profile belongs to.
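
A toy stand-in for that role-identification step is a lookup table of known title variants. The real system uses a trained model rather than this alias map; the table below only illustrates the input/output behaviour using the examples from the text.

```python
# Toy stand-in for the role-identification model: a lookup table of known
# title variants mapped to one canonical role. The actual system is a
# trained model, not a static dictionary.
ROLE_ALIASES = {
    "programador software": "software developer",
    "software engineer": "software developer",
    "programmer": "software developer",
}

def canonical_role(title):
    """Map a raw job title to its canonical role name."""
    title = title.strip().lower()
    return ROLE_ALIASES.get(title, title)  # unknown titles pass through
```

With the titles mapped onto canonical roles, offers for the same role can be grouped and compared even when the wording differs across sources and languages.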


Validate data

Data is validated by comparing patterns across sources.

After normalization, the data is validated to ensure accuracy.

For example, to judge whether a salary is accurate we need more context: seniority level, years of experience, required skills, and so on.

With this information, the job profile is compared with profiles at similar companies in the same location. If the salary fits that comparison, we then check it against the other salaries within the company itself.

With this method new salaries are validated and either included in the database or discarded.
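
The validation idea above can be sketched as an outlier check: a new salary is accepted only if it falls within a band around the typical salary of comparable offers. The tolerance band and the use of a median are assumptions for illustration; the actual validation logic is not published.

```python
# Sketch of salary validation: accept a salary only if it lies within a
# tolerance band around the median of comparable offers. The 50% band
# and the peer-selection criteria are illustrative assumptions.
from statistics import median

def validate_salary(salary, peer_salaries, tolerance=0.5):
    if not peer_salaries:
        return False  # not enough context to validate
    mid = median(peer_salaries)
    return (1 - tolerance) * mid <= salary <= (1 + tolerance) * mid

peers = [40000, 45000, 50000]
validate_salary(46000, peers)   # True: close to the peer median
validate_salary(200000, peers)  # False: discarded as an outlier
```

Salaries that pass the check are included in the database; the rest are discarded, which matches the accept-or-discard outcome described above.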