Frequently Asked Questions

What is Entropik’s Emotion AI?
What is Entropik's Emotion AI able to detect?
How does our technology recognize emotions?
What are the benefits of using Entropik's Emotion AI client-side technology?
What industries can use Entropik's integrated research platform?
When should we conduct user purchase behavior studies: at the beginning or the end of the month?
Do you work directly with brands or other media agencies?
Can we measure creative efficacy after launching it in a live environment?
Can the platform help conduct end-to-end automated UX research, specifically usability testing?
Is it possible to conduct moderated and unmoderated user testing, and to see and hear the user as they go through the prototype?
Is competitive benchmarking possible on internet banking experiences where logging in is required?
How can studies be conducted using a CCTV camera, and what is the associated accuracy level?
How can I optimize videos based on the findings once they have already been shot?
How does accuracy differ between a mobile camera and a laptop camera?
Is the calibration process for a mobile study the same for Eye Tracking and Facial Coding?
How does the AI differentiate between the visual and audio components being tested?
What does ‘30 frames per second’ mean?
Will participants' responses be subject to their state of mind at the time of taking the test?
Are the responses accurate, given that participants are being paid?
How is the authenticity of the panelists ensured? 
Is the research methodology aided or unaided?
How are Benchmarks calculated?
Is it possible to tap into many emerging markets like India, Indonesia, Mexico, and Chile for research participants? Is it also possible to use our own database?
Will Entropik extend any support pre- and post-campaign execution?

How does Entropik handle user privacy and data protection?

Entropik does not ask for or collect any personal information as part of our data curation. As part of our Facial Coding technology, we ask users for explicit consent to turn on the camera before displaying the stimuli, and we give a clear demonstration beforehand of how the camera will capture data. Frontal-face features are extracted, and the videos are deleted as soon as the technical features required for emotion prediction have been obtained in the data processing pipeline.
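
As a rough sketch of the extract-then-delete flow described above (this is not Entropik's actual pipeline; the function names and dummy feature extraction are assumptions for illustration), the deletion policy might look like this in Python:

```python
import os

def extract_frontal_face_features(video_path: str) -> dict:
    # Stand-in for the real feature-extraction step, which would compute
    # frontal-face features (e.g. landmarks) frame by frame from the video.
    # The values returned here are dummies.
    return {"frames_processed": 0, "features": []}

def process_session(video_path: str) -> dict:
    """Extract the technical features needed for emotion prediction,
    then delete the raw video so no identifiable footage is retained."""
    try:
        features = extract_frontal_face_features(video_path)
    finally:
        # The recording is removed as soon as extraction has run, whether
        # or not it succeeded, so raw footage never outlives this step.
        if os.path.exists(video_path):
            os.remove(video_path)
    return features
```

The try/finally construct guarantees deletion even if extraction fails, which is one simple way to enforce a "videos are deleted as soon as features are extracted" policy.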

Our privacy policy details what kind of data we collect, how we collect it, and where we store it. We also provide a means for users to have their non-identifiable data removed from our database. Database access is strictly restricted, and developers and other personnel within the company do not have access to the production database or the production environment.

Partner with us to Humanize Experiences

Book a Demo
