Neural ad evaluation

Cortex

Pressure-test a campaign cut before launch. Cortex shows where attention, memorability, and intent are likely to rise or fall so your team can improve the work before it goes live.

Evaluate

Upload one campaign cut

Upload a cut, get a fast strategic read, and spot what to improve before the campaign goes live.

Input

One exported MP4, MOV, or WebM campaign cut, up to 60 seconds.

Output

A clear first read with the key metrics, followed by the timeline, recommendations, and detailed analysis when you need it.

How it works

Upload video

Drop your ad

Drop in a campaign video and Cortex returns a polished first read your team can act on right away.

Drop a video ad here

or click to browse

MP4, MOV, WebM · up to 60 seconds

Expected runtime

Under two minutes for a 30-second spot.
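The accepted inputs above can be checked before upload. A minimal sketch, assuming a hypothetical client-side helper (not part of any Cortex SDK) that is handed the clip duration from a probe such as ffprobe:

```python
# Illustrative pre-upload check for the formats Cortex accepts.
# The extension list and the 60-second cap come from the page;
# the helper itself is hypothetical.
import os

ACCEPTED_EXTENSIONS = {".mp4", ".mov", ".webm"}
MAX_DURATION_SECONDS = 60

def is_valid_cut(path: str, duration_seconds: float) -> bool:
    """Return True if the file looks like an acceptable campaign cut.

    `duration_seconds` would normally come from probing the file
    (e.g. with ffprobe); it is passed in to keep the sketch self-contained.
    """
    ext = os.path.splitext(path)[1].lower()
    return ext in ACCEPTED_EXTENSIONS and 0 < duration_seconds <= MAX_DURATION_SECONDS

print(is_valid_cut("spot_30s.mp4", 30))  # True
print(is_valid_cut("spot_90s.mov", 90))  # False: over the 60-second cap
```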


A brain model, not a survey panel

Cortex runs on TRIBE v2, Meta's foundation model for predicting cortical activity from video, audio, and language simultaneously. Trained on fMRI data from 700+ subjects, it maps brain response across 20,000 cortical vertices per hemisphere. We take that raw neural prediction and distill it into 12 metrics tied to specific decisions a creative team actually makes.
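The distillation step above can be pictured as collapsing per-vertex predicted activity into a handful of named scores. A toy sketch follows; the vertex groupings, metric names, and simple averaging are invented for illustration and are not Cortex's actual mapping:

```python
# Hypothetical sketch of the "distill" step: average predicted
# activity over each metric's group of cortical vertices.

def distill(vertex_activity: list[float],
            metric_groups: dict[str, list[int]]) -> dict[str, float]:
    """Average predicted activity over each metric's vertex indices."""
    return {
        name: sum(vertex_activity[i] for i in idxs) / len(idxs)
        for name, idxs in metric_groups.items()
    }

# Toy scale: 8 "vertices" instead of 20,000, 3 metrics instead of 12.
activity = [0.2, 0.9, 0.4, 0.7, 0.1, 0.8, 0.6, 0.3]
groups = {
    "attention":    [0, 1, 2],
    "memorability": [3, 4],
    "intent":       [5, 6, 7],
}
scores = distill(activity, groups)
print(round(scores["attention"], 3))  # 0.5 = (0.2 + 0.9 + 0.4) / 3
```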

Most pre-launch testing is either too slow (survey panels take days), too shallow (attention heatmaps and facial coding measure where you look, not what the brain does with it), or too expensive to run more than once. Cortex gives you a deep read on every second of a cut, at the cost of a single upload, so you can iterate before production locks in.

TRIBE v2 processes all three modalities through V-JEPA 2, Wav2Vec2-BERT, and Llama 3.2. It placed first at the Algonauts 2025 brain prediction challenge and is fully open-source. The links below point to the original research if you want to inspect the model directly.
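The trimodal flow can be sketched as three per-modality encoders feeding one fused prediction head. The stubs below just return tiny placeholder feature vectors; in TRIBE v2 the real features come from V-JEPA 2 (video), Wav2Vec2-BERT (audio), and Llama 3.2 (language), and the fusion head is learned. Everything here is an invented stand-in, not the model's API:

```python
def encode_video(frames: list) -> list[float]:
    # Placeholder for V-JEPA 2 features (real output is high-dimensional).
    return [float(len(frames))]

def encode_audio(samples: list) -> list[float]:
    # Placeholder for Wav2Vec2-BERT features.
    return [float(len(samples))]

def encode_text(transcript: str) -> list[float]:
    # Placeholder for Llama 3.2 features over the transcript.
    return [float(len(transcript.split()))]

def predict_activity(frames: list, samples: list, transcript: str) -> list[float]:
    """Fuse the three feature streams, then map to predicted activity.

    Here "fusion" is concatenation and the "head" is a fixed linear map;
    both are illustrative placeholders.
    """
    fused = encode_video(frames) + encode_audio(samples) + encode_text(transcript)
    weights = [0.5, 0.3, 0.2]  # placeholder head, one weight per stream
    return [w * f for w, f in zip(weights, fused)]

print(predict_activity(["f1", "f2"], [0.1] * 4, "buy the thing"))
```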

Built on Meta's research. If you're from Meta, reach out.

Srijit Iyer and Adam Abdalla.