
AI-Search Part 2: "Eye Meat" and DOI Sort Algorithms

Dynamic Dashboards (“Eye Meat”) and DOI (Degree of Interest) Sort Algorithms are two critical applications of AI for sorting and displaying large quantities of content in a way that relates to the interests of a particular customer.

NOTE: This article is the continuation of Part 1: Search UX Revolution: LLM AI in Search UIs: https://www.uxforai.com/p/search-ux-revolution-llm-ai-in-search-uis

What are Dynamic Dashboards?

Dynamic Dashboards are not a new concept. Edward Tufte called these constructs “Visual Confections,” and John Maeda called these creations “Eye Meat” (super-Halloween appropriate, IMHO). Regardless of what you call them, these visual dashboards are the platform on which most of our digital experience unfolds. 

Let me just start by saying —

Figuring out what a particular customer wants to look at next is a tough problem.

An excellent example comes from Amazon.com. Note all of the red arrows — those are the items that the algorithm got wrong:

Image Source: Amazon.com

This home screen tells a nuanced story of my frequent shopping habits and likely next purchases. Just like the Suggestions and Next Steps patterns we will be covering shortly as part of the LLM Patterns series: https://www.uxforai.com/p/llm-design-patterns-re-stating-auto-complete-talk-back-suggestions-nest-steps-regen-tweaks-and-guard where AI is used to infer the next question, this Amazon page is also constructed with the help of AI. Each separate section or “rubric” has a slightly different, independent algorithm. This is why some of the sections contain somewhat repetitive information — the sneaker poster, for example. And while I’m a fan of Alita, how many teen robot mangas does Amazon think I’m going to be reading? (A lot, apparently. Four. Srsly. Four is too many. Maybe… I think.)

Some other stuff Amazon clearly got wrong here:

  1. I don’t own any cats, never buy any cat food, and never will. I’m a dog person. Period.

  2. I’m not a big fan of poached eggs. I eat them once a year. The likelihood of me buying the egg poaching appliance is… Well, let’s just say it’s highly unlikely.

  3. While I travel a great deal for various conferences, I have no plans to go to Prague (Maybe Amazon knows something I don’t… Hey, Prague UX peeps, I’m available for UX for AI Workshops!)

  4. How many sets of 5 (FIVE!) belts, garden furniture sets, and V for Vendetta posters does Amazon think I need?

The algorithm is not perfect, as the popularity of the following X (Twitter) post attests. Clearly, @GirlFromBlupo struck a nerve:

Image Source: X 

Another example of a visual dashboard comes from Google. Here’s the result of the search for Jungle Book on mobile and desktop, both done from the same Google account:

Jungle Book Search on Mobile and Desktop. Image Source: Google.com

This query is ambiguous: this original tale by Rudyard Kipling has spawned multiple movie adaptations and related tales, so the user could be asking for anything in the Jungle Book universe. However, Google Mobile quite deliberately focuses on the 2016 movie The Jungle Book by Jon Favreau. Equally bizarre is the fact that the exact same query in Google Desktop yields the 1967 Disney Classic version by Wolfgang Reitherman!

Also worth pointing out is the utterly bizarre choice of a French-language ad for the 1967 film under the name “Le Livre de la Jungle” (“The Jungle Book”). It’s the only video link that appears in the product tile on the right.

The entire page is constructed using Google’s proprietary algorithm, including the choice of layout, “See also” section, product tile, trending section, etc. Thus, the “visual smorgasbord” dashboard approach is often deliberately vague and avoids obeying strict rules, which makes it the perfect playground for AI. Here on the Jungle Book landing page, our robot overlords give us “movies about bears,” which makes a weird sort of machine sense… all the same, the row has brilliant, highly relevant content.

Movies about bears. Image Source: Google.com

Unfortunately, this “AI vagueness” sometimes backfires. Bigly.

Beware of Bias in AI Recommendations 

Here’s an example of the query “presidential candidates” performed circa September 2016: 

Presidential Candidates. Image Source: Google.com. Collected September 2016

As you can see, even this close to the general election on November 8th, Google displays not two but three major candidates. By this time, both Donald Trump and Hillary Clinton had firmly secured their party’s nominations, yet Google stubbornly refused to remove Bernie Sanders from the lineup.

But at least Hillary Clinton actually appeared in the lineup.

For the 2024 election this year, we find the situation is much worse. Here are Google’s results collected September 8, 2024 (around the same time as the 2016 image above):

Presidential Candidates. Image Source: Google.com Collected September 8th, 2024

The first page is dominated by Bush, who is “unwilling to endorse a presidential candidate.” The second page likewise features no images of Kamala Harris. This is despite Kamala having secured the Democratic Party nomination weeks before, on August 2, 2024, according to the Washington Post: https://www.washingtonpost.com/politics/2024/08/02/harris-becomes-democratic-nominee/ 

But the fun doesn’t stop there. Navigating to the Images tab, we find not a single example of Kamala Harris facing off against her opponent, Donald Trump:

Presidential Candidates Image Search. Image Source: Page 1 of Google.com Collected September 8th, 2024

Until… Page 6! Right next to images from the heated race for president of the American Library Association, and of Mohammad Bagher Ghalibaf and Saeed Jalili — candidates in the presidential election in IRAN!

Presidential Candidates Image Search. Image Source: Page 6 of Google.com Collected September 8th, 2024

This is an excellent example of the urgent and critical need to be keenly aware of AI bias whenever you are using AI to construct your own visual dashboards or to sort your search results. (NOTE: We previously covered the subject of AI bias here: https://www.uxforai.com/p/wmd-white-male-default-bias-ai and here: https://www.uxforai.com/p/transforming-ai-bias-into-augmented-intelligence)

DOI: Degree of Interest/Sort Algorithms 

Whether content appears on the first page or is relegated to page 6 (or page 1006) of your search results is determined by the DOI (Degree of Interest) algorithms, which control the sort order of items as displayed to the user. While an in-depth discussion of the subject is well beyond the scope of this article, we should mention at least a few salient points.

To give a relatively straightforward example, let’s say you are in charge of deciding whether to feature a particular new topic that is getting some decent traffic as part of your website’s default “Sort by: Popular” sort order. Here’s a graph of web views or shares of this new topic over time:

Example of Trending Topic Views over Time. Image Source: Greg Nudelman

Because anything might appear as “high growth” when you start from 0, we first have to define a certain minimum number of views to make a topic worthy of consideration, say 1,000. That number is the minimum threshold that gives “legitimacy” to the topic. At this point, we can apply an algorithm to tease out the slope of the curve, which will give us an idea of whether a topic's popularity is growing, shrinking, or remaining the same.

The higher the slope, the “hotter” and more “trendy” the topic is. If the slope is beyond a specific number, say higher than 1 (45 degrees), the algorithm might consider it a hit; the topic is then said to be “trending.” 

So, using this graph, this particular topic only becomes “trending” after the 3rd day.

However, this entire previous graph can be just a blip compared to a graph of traffic to a more prominent, perhaps more important and enduring topic:

Trending Topic Juxtaposed on the Enduring Topic Views. Image Source: Greg Nudelman

A good sorting algorithm should be able to capture new developments while at the same time keeping an eye out for consistently high-performing topics readers continue to be interested in. Looking at the graph, it is easy to see why this is not an easy task. Note that the dark blue line of the established topic dwarfs the smaller gray line of the new topic. That is why the sorting algorithm often makes special allowances, emphasizing the most recent “trending” items. In effect,

Typical sort normally has two or more different algorithms working together (competing) to determine the overall sorting order, which appears as a single continuous list to the customer.

That is why discussions concerning sort order, trending, and DOI curves so often involve highly secret proprietary algorithms, as each company strives to add its own “secret trending sauce” to the mix.
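The “two or more algorithms competing for one list” idea can be illustrated with a small sketch. Everything here is a hypothetical assumption for demonstration (the function name, the reserved slot count, the toy scorers); real DOI systems are proprietary and far more elaborate.

```python
# Hypothetical sketch of two ranking algorithms merged into one sort order:
# a trending scorer gets a few reserved top slots (the "special allowance"
# for recent hits), and an enduring-popularity scorer orders the rest.
def merge_rankings(items, trending_score, popularity_score, trending_slots=2):
    """Return one continuous list as the customer would see it."""
    by_trend = sorted(items, key=trending_score, reverse=True)
    featured = by_trend[:trending_slots]          # reserved "trending" slots
    rest = [i for i in items if i not in featured]
    rest.sort(key=popularity_score, reverse=True)  # enduring favorites next
    return featured + rest

# Toy data: trending scores vs. all-time popularity scores per topic.
items = ["new-topic", "old-favorite", "evergreen", "flash-hit"]
trend = {"new-topic": 9, "flash-hit": 8, "old-favorite": 1, "evergreen": 0}
pop = {"new-topic": 50, "flash-hit": 40, "old-favorite": 900, "evergreen": 700}

order = merge_rankings(items, trend.get, pop.get)
# -> ["new-topic", "flash-hit", "old-favorite", "evergreen"]
```

Even in this toy version you can see the tension the graph illustrates: without the reserved slots, the enduring topics’ huge view counts would bury every new topic on page 6.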

When helping your team design a sorting algorithm, get curious and don’t be afraid to ask tough questions, such as how a particular selection is made, how many algorithms there are, and which top items each algorithm surfaces. Take the time to understand how your company makes money and how the sort algorithm helps your organization achieve success.

Recall the critical role that Facebook’s sort algorithm played in spreading lies that fomented the violence of the January 6, 2021, United States Capitol attack. (1, 2, 3)

The recent NPR article: New study shows just how Facebook's algorithm shapes conservative and liberal bubbles by Huo Jingnan and Shannon Bond, published July 27, 2023, underscores the complexity of the problem. (4)

Summarizing various studies to date, the article states that there is “strong evidence that when it comes to politics, the Facebook algorithm is biased towards the extremes." 

The studies found that on Facebook, liberals and conservatives live in their own political news bubbles more so than elsewhere online. On average, about half the posts users see come from like-minded sources. One out of five users experience an echo chamber on the platform, where at least three-quarters of the posts they see come from ideologically aligned sources.

They also show that changing the platform's algorithm substantially changes what people see and how they behave on the site. According to a three-month study:

Moving users to a simple sort in reverse chronological order (without any algorithmic ranking) significantly affected how they used the platform: they posted less about politics, liked political content less, and were less likely to share that they voted or mention politicians and candidates for office.

Source: New study shows just how Facebook's algorithm shapes conservative and liberal bubbles by Huo Jingnan and Shannon Bond, published July 27, 2023

However – and this is the key:

“Getting rid of the algorithmically driven feed also curtailed the amount of time people spent on the platform, sending them to Instagram.”

Source: New study shows just how Facebook's algorithm shapes conservative and liberal bubbles by Huo Jingnan and Shannon Bond, published July 27, 2023

(Greg’s aside: One can’t help but be strongly reminded of people addicted to drugs, who, not finding their drug of choice from their usual dealer, move on down the street...)

And less time and less engagement on the platform means less money. And we are talking about a lot of money.

The article warns that: “Changing Facebook's algorithm to reduce engagement would have significant business implications. The systems serve up content they predict will keep users clicking, liking, commenting, and sharing — creating an audience for the advertising that generates nearly all of Meta's $116.6 billion in annual revenue.”

Perhaps the best summary is offered by Chris Bail, director of Duke University's Polarization Lab, who is quoted in the article saying:

"We need many, many more studies before we can come up with these types of sweeping statements about Facebook's impact on democracy, polarization, the spread of misinformation, and all of the other very important topics that these studies are beginning to shed light on… We all want this to be a referendum on, is Facebook good or bad… But it's not."

One thing is clear: 

The DOI sort algorithms and dynamic dashboards are the platform on which much of our digital experience unfolds. The importance of getting UX involved in understanding the AI algorithms driving these dynamic constructs and their effects on customer behavior cannot be overstated.

It is time for UX Designers to get involved in AI-driven products. That means learning a bit about basic analytics methods and user outcomes so we can be valuable contributors to the technical discussions and product strategy.

Here, at UX for AI, this is precisely what we strive to enable designers to do. So, thank you for being a subscriber — and please keep the questions coming, and we’ll do our best to cover them in future editions.

And don’t forget to vote on November 5th,

Greg

References

  1. How Facebook played a role in the Jan. 6 Capitol riot, The Washington Post, Oct 22, 2021. https://www.washingtonpost.com/technology/2021/10/22/jan-6-capitol-riot-facebook/ Collected October 19th, 2024.

  2. Not stopping ‘Stop the Steal:’ Facebook Papers paint damning picture of company’s role in insurrection, by Donie O'Sullivan, Tara Subramaniam, and Clare Duffy, CNN Business, October 24, 2021. https://www.cnn.com/2021/10/22/business/january-6-insurrection-facebook-papers/index.html Collected October 19th, 2024.

  3. Facebook Hosted Surge of Misinformation and Insurrection Threats in Months Leading Up to Jan. 6 Attack, Records Show, by Craig Silverman, ProPublica, Craig Timberg, The Washington Post, Jeff Kao, ProPublica, and Jeremy B. Merrill, The Washington Post, Jan. 4, 2022. https://www.propublica.org/article/facebook-hosted-surge-of-misinformation-and-insurrection-threats-in-months-leading-up-to-jan-6-attack-records-show Collected October 19th, 2024.

  4. New study shows just how Facebook's algorithm shapes conservative and liberal bubbles, by Huo Jingnan and Shannon Bond, published July 27, 2023. https://www.npr.org/2023/07/27/1190383104/new-study-shows-just-how-facebooks-algorithm-shapes-conservative-and-liberal-bub Collected October 19th, 2024.
