// DESKTOP EXPERIENCE

Built for larger screens

EmojiSpace relies on interactive vector maps and semantic clusters that require more space to explore properly.

Please open this lab on a desktop or laptop to continue.

// LAYER 01: THE HOOK

Ever wondered why a tiny typo means you can't find the right emoji, or why searching 'road trip' gives you nothing? Keyword search is a bit of a stickler for rules. Try 'car' first, then try 'road trip' to watch standard search completely lose the plot.

// MiQ AI LABS

EmojiSpace

// KEYWORD / REGEX

The Filing Cabinet

Awaiting input
This only looks for exact labels. No nuance, no context, no vibes. If it's not tagged perfectly, it doesn't exist.

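A minimal sketch of the filing cabinet's behaviour, assuming a made-up label table (not EmojiSpace's real corpus): the query matches a stored tag exactly or as a substring, or nothing comes back.

```python
# "Filing cabinet" search: an exact/substring label match or nothing.
# The LABELS table below is illustrative, not the real emoji corpus.
import re

LABELS = {
    "🚗": ["car", "automobile", "vehicle"],
    "🗺️": ["map", "world map"],
    "⛽": ["fuel", "gas", "petrol"],
}

def keyword_search(query):
    # Escape the query so it is treated literally, not as a regex pattern.
    pattern = re.compile(re.escape(query), re.IGNORECASE)
    return [emoji for emoji, tags in LABELS.items()
            if any(pattern.search(tag) for tag in tags)]

print(keyword_search("car"))        # → ['🚗']
print(keyword_search("road trip"))  # → []  no tag contains the phrase
```

'car' hits a tag, 'road trip' hits nothing: the query never gets close to meaning, only to spelling.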
// VECTOR SEARCH

The Spacetime Map

Corpus loading
?
Your idea's coordinate
Emojis with the exact same vibe

// PROJECTED FIELD

The Vector Collider

Awaiting sources
:: A SHARED EMBEDDING SPACE

Pick two concepts to draw their vectors from the origin, mix them into a new point, normalise that direction, and retrieve nearby emojis.

H = normalize((1 - t)A + tB)

By sliding t from 0 to 1, we watch the hybrid coordinate travel through the shared embedding space and reveal a totally new meaning.
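The blend formula runs directly as written. A sketch with toy 3-d unit vectors (real embeddings have hundreds of dimensions, and the 'fire'/'water' labels are illustrative):

```python
# The collider's blend: H = normalize((1 - t)*A + t*B).
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def blend(A, B, t):
    """Interpolate between vectors A and B, then renormalise the result."""
    return normalize((1 - t) * A + t * B)

A = normalize(np.array([1.0, 0.0, 0.0]))  # stand-in for "fire"
B = normalize(np.array([0.0, 1.0, 0.0]))  # stand-in for "water"

H = blend(A, B, 0.5)
print(H)  # → [0.7071 0.7071 0.], the halfway direction at unit length
```

At t = 0 the hybrid sits exactly on A, at t = 1 exactly on B; renormalising keeps every intermediate point on the unit sphere, so distance comparisons against the corpus stay fair.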

Waiting for corpus metadata.

// VECTOR COLLIDER · DESKTOP MODE

The collider needs more room

The projected field, X-Ray labels, and hybrid neighbour map rely on horizontal space. On desktop, this scene becomes a hands-on vector maths lab.

Open EmojiSpace on desktop to blend vectors properly and inspect the maths.

// NETWORK EXPLORER

Finding the Neighbourhood

Search above to map a concept, then tweak the dials below. Watch how the AI groups similar emojis together to form distinct behavioural clusters.

:: CLUSTER OPTIMIZER
K = 8
Separation 0.000
Best fit K 8

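The "best fit K" dial can be sketched as a sweep: run k-means for several K and keep the K with the highest separation. The score below (mean inter-centroid distance over mean intra-cluster spread) and the toy blobs are illustrative stand-ins; the optimiser's real metric and data are not shown here.

```python
# Sweep K over a toy dataset and pick the K with the best separation.
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=50):
    # Plain Lloyd's algorithm with random initial centroids.
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centroids[None], axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

def separation(X, labels, centroids):
    # Illustrative score: how far apart clusters sit vs how tight they are.
    intra = np.mean([np.linalg.norm(X[labels == j] - c, axis=1).mean()
                     for j, c in enumerate(centroids) if np.any(labels == j)])
    inter = np.mean([np.linalg.norm(a - b)
                     for i, a in enumerate(centroids)
                     for b in centroids[i + 1:]])
    return inter / intra

# Three well-separated toy blobs standing in for emoji embeddings.
X = np.concatenate([rng.normal(c, 0.1, size=(30, 2))
                    for c in ([0, 0], [5, 0], [0, 5])])

scores = {k: separation(X, *kmeans(X, k)) for k in range(2, 6)}
best_k = max(scores, key=scores.get)
print(best_k)
```

Tight, well-spaced blobs score high; forcing too few or too many clusters drags the ratio down, which is the intuition behind the Separation readout above.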
// CLUSTERS · DESKTOP MODE

Best explored on a larger screen

The live cluster map needs more horizontal space to show segment shape, labels, and neighbourhood paths clearly.

  • Map a concept into nearby semantic neighbourhoods
  • Tune cluster count and compare segment separation
  • Watch related clusters ignite in place

Open EmojiSpace on desktop to use the full cluster explorer.

:: RETRIEVAL-AUGMENTED GENERATION
RAG

Planner version

:: HOW TO SAY IT

:: AT MiQ

Get these terms right and the rest of the story lands faster, because you can explain the mechanism as well as the outcome.

Then land the short recap before opening the Vector Collider.

// RECAP

Three Things to Keep

Before the maths gets wild, here’s the clean takeaway from the tour.

01

Standard search breaks when meaning is implied, fuzzy, visual, or phrased differently.

>That’s why Sigma Audiences can use RAG to interpret an open prompt, retrieve the right audience ingredients, and assemble a synthetic persona without forcing planners through a rigid taxonomy.
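The interpret-retrieve-assemble loop reads like code. A schematic sketch only: the corpus names, vectors, and the final assembly step are all invented for illustration, not Sigma's actual pipeline, and a real system would call a generator model where the stub string is built.

```python
# Schematic RAG loop: embed the open prompt, retrieve the closest stored
# "audience ingredients", then assemble an answer from them.
import numpy as np

# Hypothetical ingredient embeddings (2-d for readability).
CORPUS = {
    "ev intenders": np.array([0.9, 0.2]),
    "luxury travel": np.array([0.2, 0.9]),
    "road trip planners": np.array([0.8, 0.4]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(prompt_vec, k=2):
    # Rank every stored ingredient by similarity to the prompt.
    return sorted(CORPUS, key=lambda name: -cosine(prompt_vec, CORPUS[name]))[:k]

def answer(prompt_vec):
    ingredients = retrieve(prompt_vec)
    # A real system would hand the ingredients to an LLM here;
    # this stub just assembles a persona description.
    return "persona built from: " + ", ".join(ingredients)

print(answer(np.array([0.85, 0.3])))  # a hypothetical prompt embedding
```

No rigid taxonomy is consulted at any point: the open prompt lands wherever its meaning lands, and the nearest ingredients come back.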

02

Modern AI retrieval often works by mapping ideas into vector space and finding nearest neighbours.

>In Sigma, that lets us connect 700T signals into related concepts, use clustering to reveal patterns, and build richer profiles across our Watching, Browsing, and Buying framework.
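Nearest-neighbour retrieval is small enough to show whole. The 2-d "embeddings" below are made up for illustration; real models emit hundreds of dimensions, but the mechanism is identical.

```python
# Vector retrieval: rank stored coordinates by cosine similarity
# to the query and return the top k. Embeddings here are invented.
import numpy as np

EMBEDDINGS = {
    "🚗": np.array([0.9, 0.1]),
    "🛣️": np.array([0.8, 0.3]),
    "🎂": np.array([0.1, 0.95]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest(query_vec, k=2):
    scored = sorted(EMBEDDINGS.items(),
                    key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [emoji for emoji, _ in scored[:k]]

# A hypothetical embedding for "road trip": no tag has to match,
# closeness in the space is enough.
print(nearest(np.array([0.85, 0.2])))  # → ['🚗', '🛣️']
```

The phrase 'road trip' never appears as a label anywhere, yet the car and the road come back first, which is exactly the failure mode of the filing cabinet inverted into a feature.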

03

Embeddings turn text, images, and concepts into coordinates inside a shared latent space.

>That shared space helps Sigma move from raw intent to targetable segments, even when the exact words never appear in the underlying data.

Emojis make the idea visible. In practice, the same semantic machinery helps Sigma understand intent, connect signals, and build richer audiences.

Go further into the Vector Collider to see how the maths composes new meaning.