0:02 This week's executive briefing is all
0:05 about the future of intelligent pixels.
0:07 We're moving from product as an
0:10 interface bundle to product as a durable substrate with pixels as a throwaway layer. And
0:15 I want to dig into what that means. And
0:17 yes, the catalyst for this is Nano
0:19 Banana Pro and the transformation it has
0:20 brought to the way we think about
0:23 images. But I look at Nano Banana Pro as
0:25 really the tip of the spear. I'm not interested in whether you think this particular model is the best. I'm interested in this as a tipping point, and we'll see more models that are even better than this in the future. So what does that
0:37 mean for our software strategies? So I
0:39 want to break this into a few moves and
0:40 we're going to go through them one by
0:42 one and by the end I think you're going
0:44 to see where we're ending up from a
0:45 software build perspective, from a
0:48 software buy perspective, even from a
0:49 talent allocation perspective. So let's
0:52 jump into it. Number one, coherent
0:55 interfaces were an economic hack, not
0:57 necessarily a law of nature. For 40
1:00 years, we treated user interfaces as
1:02 scarce because they were expensive to
1:04 design, they were expensive to build,
1:06 they were expensive to QA, to localize,
1:08 to document, to train on. I still
1:11 remember the days of on-prem Oracle servers in the basement, right? That's the world we lived in, where
1:16 software and hardware were both very
1:18 expensive and that meant when you got an
1:20 interface it had to be shared and serve
1:22 thousands and millions of users and use
1:25 cases. I used Oracle iStore. Oracle iStore, sorry to anyone out there who's from Oracle, is a terrible, terrible, terrible interface. It is absolutely awful. I have deleted half a store because of Oracle iStore's terrible interface. But it had to be shared by
1:42 thousands and millions of users and my
1:44 preferences didn't matter. Interfaces
1:46 had to be durable. You had to amortize
1:48 the design and development cost for
1:49 years. So Oracle iStore stayed the same for a long, long time, because no one wanted to change it and Larry could keep making money. So that meant we
1:58 optimized our organizational structures
2:01 around coherent and long-lived
2:05 interfaces. So we would have opinionated
2:06 interaction design. We would have
2:09 navigation. We would have page layouts
2:11 that had very clear mental models
2:12 embedded. We had training. We had
2:14 certifications. Has anyone ever been
2:16 Salesforce certified? Has anyone been
2:18 Workday certified? Anyone certified in
2:21 how to use Jira? This is what I mean.
2:23 This also meant there was huge change
2:26 management overhead for any major UI
2:30 shift. That made sense when every pixel essentially had to be hand-tooled. That is
2:36 no longer true. And we need to recognize
2:39 that this moment, this 2, 3 week period,
2:41 this is the tipping point. We've seen
2:43 signs of it before, but this is the
2:45 moment it all changed. Generative and
2:48 agentic waves are making pixels cheap
2:51 and contextual, and we just hit that
2:53 tipping point in the last couple weeks.
2:55 You have three overlapping shifts
2:57 happening at once and reinforcing each
2:59 other to drive this tip. First,
3:01 generative user interfaces are models
3:04 that can spit out full screens from text
3:08 or context. You have Uizard, v0, Galileo. They already generate multi-screen mock-ups from prompts. Nielsen has talked about this as the
3:16 dawn of cheap disposable UI. I don't
3:19 care how far down the hype train you go.
3:22 I think the key is to recognize that
3:24 ephemeral user interfaces are popping up
3:28 everywhere. In fact, there's an entire startup called Wabi that just allows you to make generative interfaces and software for yourself now as a personal consumer. You can have generative interfaces in Comet and generative interfaces in other smart browsers. So you have generative UI
3:45 becoming a thing. Number two, ephemeral
3:47 and generative UI concepts are
3:50 exploding. And so there's a growing
3:52 conversation around what hyper
3:54 contextual applications or panels might
3:56 look like and how we might create and
3:58 destroy them and keep the application
4:00 state the same. That is different from
4:02 the technology itself. Generative UI is about the technology and the user interface; the idea of UI concepts, really the design language, is growing, and we need a new design language for this change we're all going through. The third trend is agentic
4:16 software that drives other software.
4:18 This, funnily enough, is where Nano Banana Pro, I think, rightly comes in.
4:23 Google smartly placed their image
4:26 generator right out of the gate on an
4:29 API so that agents can call it and come
4:32 back with images. People who are
4:35 enterprising are already using this for
4:38 interface design from Nano Banana Pro. I
4:40 am not talking about theory. I'm talking
4:42 about what I actually see on X on Reddit
4:44 other places with screenshots with
4:48 videos. People are using the API call to pass a string of data in a structured prompt query to Nano Banana Pro and retrieve a chart or a graph that they can then display as the past week's sales, the past day's customers, whatever it is that they need for internal metrics. They can just automatically query and get a nice chart back from Nano Banana Pro. That is a generative interface driven by agentic software.
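The pattern people are sharing can be sketched in a few lines. This is a minimal, hypothetical sketch, not the actual Nano Banana Pro API: the helper name and the `image_client.generate` call are assumptions. The point is the shape of it, structured data in, rendered chart back.

```python
import json

def build_chart_prompt(title, labels, series):
    """Compose a structured prompt asking an image model for a chart.

    Embedding the data as JSON pushes the model toward rendering the
    exact values instead of inventing plausible-looking ones.
    """
    payload = {"title": title, "labels": labels, "series": series}
    return (
        "Render a clean bar chart as an image. Use exactly this data, "
        "with labeled axes and a legend:\n" + json.dumps(payload, indent=2)
    )

# Example: last week's sales, as pulled from an internal metrics query.
prompt = build_chart_prompt(
    "Sales, past week",
    ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"],
    {"units_sold": [120, 95, 143, 110, 162, 88, 131]},
)

# An agent would then hand `prompt` to the image model's API, e.g.:
# chart_png = image_client.generate(prompt)  # hypothetical client, not a real SDK
```

The agent never designs a chart UI; it serializes the data it already has and lets the model compile the pixels.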
5:15 Fundamentally, the interface is
5:17 something that is starting to morph
5:20 based on user context and it isn't
5:22 staying fixed anymore. So, if you put
5:25 that together with the idea of throwaway
5:27 pixels, fundamentally, you have software
5:30 that's changing in value. Software is
5:32 becoming generated on demand from intent
5:35 and context. It's becoming private to
5:37 the user in the moment for that
5:39 particular ask. It's becoming discarded
5:41 when that moment passes. One of the most
5:44 instructive descriptions of vibe-coded apps has been a recognition from folks who've done this over 20 or 30 projects
5:51 that they are finding that these apps
5:53 are valuable in the moment and some of
5:55 them they may use again but some of them
5:56 they created just for a single use and
5:58 that was worth it to them. So Nano
6:01 Banana Pro is basically a future-leaning version of this: a model that understands UI structures, sketches, diagrams, and flows well enough that
6:10 UI just becomes one more output modality
6:11 like text or code. That's your
6:14 disposable pixels. Before I get too far
6:15 down the road, I don't want you to walk
6:18 away at this point and think Nate thinks
6:21 software is dead or Nate thinks that
6:23 software won't exist anymore. That's not
6:25 true. I think the opposite. But I do
6:27 want to actually talk through what this
6:29 means because I think software is going
6:32 to profoundly change. So let's look at
6:34 what disposable pixels actually look
6:36 like in practice. I want to call out
6:39 three layers. Layer number one is the
6:41 system of record or the system of
6:44 decisioning. So in this sense the things
6:46 that B2B SaaS was good at, they don't
6:48 die. They just move downward in the
6:51 stack. So data models, workflows,
6:54 permissions, audits, compliance, things
6:56 that we paid for when we purchased the
6:59 software, things that we pitched when we
7:00 wanted to be entrepreneurs and make
7:02 money off of building stuff. It was this
7:04 hard stuff, right? That's moving down
7:06 the stack. Domain logic, forecasting, pricing engines, how you handle interconnects, APIs, and webhooks. This layer, frankly, is durable. It isn't going anywhere. Nano Banana Pro is not taking that away, and neither is any other image generator. It is very value-dense. It's where moats live. It's why I'm not super worried about Salesforce for the medium to long term. Layer
7:32 number two above that system of record
7:35 is intent planning and operation. And
7:37 this is the layer that interprets intent. If you say, "Show me which enterprise customers in EMEA have renewal risk this quarter, give me a CSM touch gap no longer than 45 days, and then please draft an outreach email."
7:52 That's a series of tasks that an AI
7:55 agent can pick up pass off to other AI
7:57 agents and start to execute against the
8:00 system of record. Layer two is becoming
8:02 an agentic layer. It's not all the way
8:05 there yet, but I don't know anyone who
8:08 operates a B2B SaaS company that isn't
8:11 working on some version of layer 2. And
8:13 in fact, most businesses are working on
8:15 some version of layer 2 for their back
8:18 office operation because this kind of
8:21 experience is what we have all wanted
8:23 software to be and we never got a
8:24 chance. If you remember back when I said
8:26 software was something we had to conform
8:29 to, we never really wanted that. We
8:31 wanted software to be more personal and
8:33 with an agentic layer over the top of a
8:36 solid data foundation, we finally have
8:38 that chance. So that means the agent can
8:40 hit your CRM, it can hit your customer
8:42 data warehouse, it can run the queries,
8:43 it can call the email system, the
8:46 ticketing system, it can decide what
8:48 needs a UI and what ought to be
8:51 auto-executed. All of that can happen and
8:55 then you can finally get to the UX.
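As a rough illustration of what that interpretation step might look like, here is a toy planner. Everything in it, the `Task` shape, the keyword matching, the action names, is invented for illustration; a real layer two would use an LLM rather than string matching. The output shape is the point: a list of tasks, each marked as safe to auto-execute or needing a human.

```python
from dataclasses import dataclass

@dataclass
class Task:
    action: str        # e.g. "query_crm" -- invented names for this sketch
    needs_human: bool  # True -> compile pixels; False -> auto-execute

def plan(intent: str) -> list:
    """Toy intent interpreter: map a request to tasks against the system of record."""
    tasks = []
    if "renewal risk" in intent:
        # Read-only query against the CRM: safe to run without a UI.
        tasks.append(Task("query_crm_for_at_risk_accounts", needs_human=False))
    if "draft" in intent and "email" in intent:
        # Outbound communication is a judgment call: surface pixels.
        tasks.append(Task("draft_outreach_email", needs_human=True))
    return tasks

tasks = plan("show me EMEA renewal risk this quarter and draft an outreach email")
ui_tasks = [t for t in tasks if t.needs_human]  # only these get pixels compiled
```

The decision of "what needs a UI" becomes a property of each task, not a page someone designed in advance.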
8:58 Layer three is pixels, but not pixels as the hand-tooled, crafted objects that we had to live with back in the Oracle iStore days. I mean pixels as a compiled artifact of intent. Only when it needs your judgment does the system compile pixels in this model. It might be a one-off panel, right? It may have a ranked table of at-risk customers. It might have an inline suggested outreach. It might have a toggle for send now, schedule, and design. It's a transient visualization: a specific cohort chart or funnel for this question only, and a narrow editor UI for exactly one structured decision.
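A compiled-artifact-of-intent panel could be represented as plain data. This is a hypothetical sketch, the field names are invented, but it captures the idea: a ranked table, one suggested action, a narrow set of choices, and a TTL so the panel is discarded when the moment passes.

```python
def compile_panel(at_risk_customers):
    """Compile a throwaway panel spec for exactly one structured decision:
    which at-risk customers get outreach, and when."""
    return {
        "kind": "one_off_panel",
        # Ranked table of at-risk customers, highest risk first.
        "table": sorted(at_risk_customers, key=lambda c: c["risk"], reverse=True),
        "suggested_outreach": "Checking in ahead of your renewal...",
        # A narrow set of actions -- this UI exists for one decision only.
        "actions": ["send_now", "schedule", "edit"],
        # Discarded when the moment passes.
        "ttl_seconds": 900,
    }

panel = compile_panel([
    {"name": "Globex", "risk": 0.35},
    {"name": "Acme", "risk": 0.82},
])
```

A renderer, human or model, turns this spec into pixels; the spec itself is the only durable thing about the panel.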
9:39 In other words, we are moving to a world
9:43 where at least some of the UI does not
9:46 generalize. I am not trying to suggest
9:48 that all of the UI is going to be
9:50 composable. And part of why I'm not is
9:52 that we are creatures of habit. We have
9:56 a lot of assumptions around how UI ought
9:58 to look and we do get used to our
10:00 software products pretty quickly and we
10:02 don't like it when they change. I think
10:05 there are going to be common cores in
10:09 our software stacks that remain durable
10:12 even in their UI. Think of the homepage for a B2B SaaS product that shows you
10:18 customer conversations and you want that
10:20 homepage to be easily navigable and you
10:21 don't want it to be new and different. I
10:24 think that kind of UI is here to stay.
10:26 It's not going to be AIdriven. I think
10:28 the key is that there are going to be a
10:31 whole new class of user interfaces that
10:34 nest under that that are going to be
10:36 heavily used that are generative that
10:39 are throwaway that are rendered at
10:42 runtime for that particular person. We
10:44 are arguably already doing this when we
10:46 create an interface on the fly through Perplexity and then share that throwaway interface with one or two other people as a way of talking about a topic. We're starting to do it in ChatGPT when we have shared conversations and ChatGPT creates an artifact that we both view
11:02 together. So interfaces are becoming
11:05 this sort of two-class object, where you have durable, permanent interfaces that may be a common core with high habit, the front door of the application, and this disposable layer that makes up for a lot of the pages that were hand-tooled before but never got a lot of traffic. Anyone who
11:23 has managed a SaaS application will tell you that traffic decays on an exponential curve: your top two or three pages account for most of your traffic. But you have to put just as
11:35 much work into all these other pages
11:37 that only a couple of people want. Those
11:40 are the pages that I think are largely
11:42 at risk during this transition. We are
11:45 going to see SaaS applications that only
11:48 have two or three main pages and
11:50 everything else may be generated for the
11:52 user on the fly. Sure, the user may be
11:54 able to save it in some place so they
11:55 can come back to it if they like that
11:58 particular view. But fundamentally, they're going to be much more composable than before. And that brings me, I think, to a
12:04 chance to talk about the differences
12:06 here because I want to be really clear
12:09 about how different a coherent, consistent, hand-tooled interface is versus a disposable-pixel interface. If
12:17 you want to lay that out and talk about
12:19 different horizons and axes of value,
12:22 they could not be more different. The
12:24 time horizon for a traditional interface
12:26 is measured in months at best. And for
12:28 disposable pixels, it can be done in
12:30 seconds. The design target: you often have a lot of people focused on personas, roles, and generalized workflows for your fancy interface. For disposable pixels, the agent is going to decide. It's not going to be a human. The agent is going to put a user, a moment, and a goal together and go somewhere. Your mental model for a
12:50 coherent interface app is learn this
12:52 app. And I think that is actually one I
12:54 would really like to emphasize from a
12:56 talent perspective. Most of the talent
12:58 at tech companies and at non-tech companies still has the mental model of learn this app, and they've brought that with them to ChatGPT in the AI era. That does not serve you, because the
13:11 world we're moving to with disposable
13:14 pixels is more like what AI actually is.
13:17 State your intent, do the prompt, and UI
13:20 appears when needed. And that could not
13:22 be more different than assuming that the
13:24 app is static and you can learn it. And
13:26 so much of the time we assume the cost structure is the same. I've called this out: it's so different. Instead of
13:32 a heavy upfront cost for traditional
13:34 software, disposable pixels have heavy model-training costs, but the pixels themselves are functionally free. The models have been
13:41 paid for and you can get cheap, cheap,
13:43 cheap iteration. Even Nano Banana Pro,
13:45 which is relatively expensive now and
13:47 will get cheaper, it's still dirt cheap,
13:50 relatively speaking. The consistency is
13:52 something I want to call out. This gets
13:54 viewed as a concern for a lot of
13:56 generative interfaces. Consistency value
13:58 is obviously very high for traditional software. For disposable pixels, consistency lives mostly inside the agent planning and the durable state and record layer. It is not in the pixels.
14:07 And I think that a lot of times
14:10 proponents of generative UI fail to make
14:12 this connection. They tend to say that
14:14 generative UI is whatever you want it to
14:16 be without recognizing that it has to
14:19 rest on a durable software substrate
14:22 that does not change, that is not
14:24 ephemeral. Differentiation or how
14:26 software differentiates from others is
14:28 also in and of itself different. So let
14:30 me explain what I mean. In the traditional software days, if you were pitching your software in the VC era, the pitch was "our software is better." You would call out look and feel. You would call out interaction design. You'd call out UX patterns. You'd call out the smarts of the machine learning inside. You
14:45 would call out the cleanness and
14:46 efficiency of your workflow. The way
14:48 you'd understood the problem. With
14:50 disposable pixels, you call out the
14:53 outcomes because the AI agents are doing
14:54 more and more of the work. You would
14:56 call out the speed from intent to
14:59 action. And as an example of speed, it
15:04 took me 10 seconds to craft a perfect
15:10 chart of annual GDP in the US and Germany, compared on the same chart, in Nano Banana Pro, from 1960 to 2025. 10
15:18 seconds. You're not going to beat that
15:21 with a traditional BI tool. The speed
15:24 from intent to action is addictive and
15:26 it is driving consumer and business
15:28 behavior. And I think that we are
15:30 fooling ourselves if we think anything
15:32 else. Look, coherent interfaces are not
15:34 going to disappear. They're just going
15:36 to stop being the default shape of
15:37 software. They're going to become
15:39 perhaps a fallback when tasks are
15:41 ambiguous. They're going to become a
15:43 shared frame for multi-user
15:45 collaboration. They're going to become a
15:46 meta surface where you orchestrate
15:48 agents. It's just going to look
15:50 different. I want to go a bit deeper
15:52 here on the B2B SaaS side, partly because I am very deep in B2B SaaS myself and I think this also hits B2B SaaS profoundly. I want to call that out whether you're buying B2B SaaS, you're a leader in B2B SaaS, or you're a builder in B2B SaaS; that should cover a lot of folks. This is a big deal. So the
16:08 disposable pixel story is extra
16:10 complicated and I think it justifies a
16:12 little sidebar here. First I want to
16:15 call out that right now, today, a ton of the enterprise value is framed around this idea that we own the primary surface where the job happens. CRMs think that way, ERPs think that way, HR information systems think that way, PLG analytics systems think that way. If the primary interaction moves to an agent or co-pilot surface, then your own UI is just a reference implementation. It's not the default touchpoint anymore, and so your API behavior and your data semantics matter more than your navigation bar. So the bundling power
16:49 shifts from "is this the system with the best dashboard," which is what sales has sold on in B2B SaaS for a really long time, to "is this the system that is easiest for agents to choreograph?" And I
17:01 think a lot of companies don't have a
17:03 good answer to this. It also means that UI
17:05 is becoming a product surface that you
17:07 do not fully control. If customers are
17:09 using generative UI tools on top of your
17:12 APIs, they are letting their own
17:14 internal design systems and their own
17:17 models render their own views of your
17:19 data. And then your canonical UI is just one
17:21 of many frontends. And so you're
17:22 competing with internal task panels,
17:25 with co-pilot generated micro apps, with
17:27 perhaps a third party universal
17:29 workspace tool that comes along. In
17:32 other words, you are at risk of being disintermediated from the customer relationship, because you get aggregated with many other SaaS products behind one agentic interface. And so where SaaS still wins
17:45 is where it's able to be a substrate as
17:47 a service where you own the canonical
17:48 state for something, the contracts, the
17:50 ledgers, the records, the risk models,
17:52 whatever it is. And that means that you
17:55 are embedded in domain flows that track
17:57 real value. So SLAs, compliance, reference data, being safe and predictable for agents to call: that is a way to win. If you have strong schemas, if you have good safeguards, if you have idempotency (say that three times fast), then in a disposable pixel world,
18:14 you become less of a thing with screens,
18:16 which is what most software has been,
18:18 and more of a high-integrity service
18:21 that agents and generators can rely on.
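Idempotency deserves one concrete example, because it is exactly what makes a substrate safe for retry-happy agents. This is a generic sketch with an invented `LedgerService`, not any particular vendor's API: each mutation carries an idempotency key, so an agent that retries after a timeout cannot double-apply the change.

```python
class LedgerService:
    """A substrate agents can safely retry against: every mutation carries
    an idempotency key, so replaying the same call cannot double-apply it."""

    def __init__(self):
        self.balance = 0
        self._seen = {}  # idempotency key -> cached result

    def credit(self, amount, idempotency_key):
        if idempotency_key in self._seen:
            # Replay of a call we already applied: return the cached result.
            return self._seen[idempotency_key]
        self.balance += amount  # first time only: apply the change
        self._seen[idempotency_key] = self.balance
        return self.balance

svc = LedgerService()
svc.credit(100, "agent-run-42")  # the agent calls once...
svc.credit(100, "agent-run-42")  # ...then retries after a timeout
assert svc.balance == 100        # applied once, not twice
```

This is the same pattern payment APIs use with idempotency keys; for an agentic caller that cannot always tell whether a timed-out request landed, it is table stakes.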
18:23 Let's transition to the talent side.
18:25 What happens to designers, PMs, and
18:28 engineers in a world where we start to
18:30 have generative UI? For designers, you
18:32 have to shift the way you think, right?
18:33 You're the designers on your team, the
18:35 designers you hire. If you're a designer
18:36 listening to this, you are moving from
18:39 owning specific flows and screens pretty
18:42 rapidly into defining interface
18:44 grammars, into defining constraints, into figuring out safe snap points
18:49 for generative UI. You are becoming
18:52 language designers and safety engineers
18:55 for human attention. If you're a PM,
18:56 you're used to a world where what
18:58 feature or page do we build next is the
19:01 core question. You're moving to a world
19:04 where what intents do we support? What
19:06 state changes must be safe? What
19:08 decisions need human judgment versus
19:11 being fully automated? So instead of
19:13 just creating a static wireframe, you're
19:15 moving to a world where you're trying to
19:17 spec out intent, state, and outcome
19:19 loops. And that's really different.
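What might an intent/state/outcome spec look like instead of a wireframe? Here is one hedged guess at the shape, with invented field names: the spec declares which state changes are safe, which decisions need human judgment, and a gate an agent runtime could consult.

```python
# A PM spec expressed as an intent/state/outcome loop rather than a wireframe.
# All field names here are invented for illustration.
INTENT_SPEC = {
    "intent": "reduce_renewal_risk",
    "state_changes": {
        "send_email": {"safe": True, "reversible": False},
        "apply_discount": {"safe": False, "reversible": True},
    },
    "requires_human": ["apply_discount"],  # decisions that need judgment
    "outcome_metric": "renewals_saved",
}

def gate(action, spec):
    """Decide what an agent runtime should do with a proposed action."""
    if action in spec["requires_human"]:
        return "compile_ui"  # surface pixels for a human decision
    if spec["state_changes"].get(action, {}).get("safe"):
        return "auto_execute"
    return "reject"  # unknown or unsafe state change
```

Instead of reviewing mockups, the PM reviews which actions land in which bucket, and the generated UI only ever appears where the spec demands judgment.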
19:21 Engineers, especially front-end
19:23 engineers, are used to front-end pixel
19:25 pushing. And now you need to start
19:27 thinking about building stable
19:30 interfaces for agents and generators and
19:32 a thin canonical shell. You may want to
19:34 build something that enables those snap
19:35 points. You may want to build something
19:38 that enables validation logic. You may
19:40 want to build something that enables a
19:42 degree of composability within safe
19:44 constraints. And so your interface
19:46 backlog for designers, PMs, and engineers begins to change here, because instead of traditional tickets that come in in Jira, you have new intents that
19:53 you want to support, new system
19:55 behaviors, new constraints or
19:57 invariants, new components or layouts
19:59 the generator might use. It's not just
20:01 add another settings page. Now I do want
20:04 to call out there are places where
20:07 coherent traditional software still
20:10 wins. Cognitive mapping is a big one. So
20:11 humans do like stable landmarks. I
20:13 mentioned this earlier. If you are doing
20:15 complex work like trading, like
20:18 medicine, incident response, people rely
20:20 on deep spatial memory of their tools.
20:22 Completely shifting pixels every time
20:25 adds cognitive load and risk. This is
20:26 one of the places where I think
20:29 Perplexity is making an incorrect choice in the finance space. The Bloomberg Terminal
20:34 may look like a maze to most people, but
20:36 it is software that people with a deep
20:38 spatial memory of the tools rely on for
20:40 complex work. It is not getting
20:43 disintermediated by Perplexity Finance. Whatever Perplexity says, there's a floor of coherence that you cannot cross
20:50 without hurting performance. I would
20:51 also like to call out that audit, training, and compliance are a big factor here.
20:55 Regulated environments need very
20:57 reproducible flows. "Show me exactly what the user saw when they approved the loan" is not something you can answer with "it was a generative interface, so I don't know." That's not going to work with an auditor. Ephemeral UIs make this very hard unless you can capture and version the UI spec itself as a first-class artifact, and that gets very, very complicated very, very fast. I think that
21:19 the incentives are strongly in favor
21:22 of coherent software there. Team
21:24 collaboration is probably also a space
21:25 where you're going to see coherent
21:27 software. So shared work needs shared
21:29 views. Look at this dashboard. Check
21:31 this queue. And if everyone has a
21:32 different ephemeral panel, you need
21:34 explicit mechanisms for pinning, for
21:36 sharing, for standardizing those panels.
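One way to make pinning and sharing ephemeral panels mechanical is to content-address the panel spec. This is a sketch of the idea, not any product's implementation: hash the spec, store it under that pin, and everyone who opens the pin gets exactly the same view, which also happens to be what an auditor would need.

```python
import hashlib
import json

def pin_panel(spec, registry):
    """Pin an ephemeral panel by content-addressing its spec, so teammates
    (and auditors) can retrieve exactly the view that was shared."""
    canonical = json.dumps(spec, sort_keys=True)  # stable serialization
    pin_id = hashlib.sha256(canonical.encode()).hexdigest()[:12]
    registry[pin_id] = spec
    return pin_id

registry = {}
spec = {"kind": "funnel", "cohort": "EMEA-Q4", "columns": ["stage", "count"]}
pin = pin_panel(spec, registry)
# The same spec always yields the same pin: shared views stay reproducible.
```

Pinning turns a throwaway panel back into a durable, shareable artifact on demand, without making every panel durable by default.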
21:38 I am going to go out on a limb here, and I don't know that this is true, but I'm going to suggest that Slack has basically this vision for
21:46 their product roadmap. Slack is becoming
21:50 a place that is benefiting from the move
21:54 to generative UI. Not because Slack is
21:56 itself a generative UI. It's very
21:59 stable, but because it is stable and it
22:01 is a place where teams collaborate and
22:04 know the interface well. It is a place
22:06 where all those hooks that Slack has
22:09 built into other tools can become
22:11 passively agentified. The agentified
22:13 benefits can just flow into Slack as a
22:15 value proposition. And so when people
22:17 build charts in Nano Banana Pro, the
22:20 demo videos they do always show them
22:22 popping the chart back into Slack where
22:24 the team can see it. That is not an
22:27 isolated incident. That is where Slack's
22:29 value proposition is starting to shift
22:32 as a stable team collaboration substrate
22:35 in a generative UI world. So the mature
22:37 pattern is probably a spectrum. You're
22:39 going to have highly standardized and
22:41 coherent shells for regulated flows, for
22:43 shared operational views, for team
22:44 training and onboarding, for team
22:46 collaboration, and you're going to have
22:49 disposable pixels that operate inside
22:51 that shell for exploratory analysis, for
22:53 micro decisions, for personalized
22:55 shortcuts, for "just for me" flows. I want
22:58 to suggest to you that it is okay that
23:00 we have both and that we do not have to
23:04 insist on a binary fight like I see so
23:06 many times where people will say B2B SaaS
23:09 is dead and only generative UI is the
23:10 future. We will never have stable
23:13 interfaces. That's a terrible take. But
23:15 an equally terrible take is we will
23:18 never see generative UI interfaces in
23:20 serious SaaS applications. That is just not true. And anyone who has managed a serious SaaS application, as I have, will
23:27 tell you that we have hundreds or
23:29 thousands of pages that we're managing,
23:32 many of which we would dearly love to
23:34 make generative because they're so
23:36 expensive to maintain through the
23:38 traditional rubric. And so when I step
23:40 back and look at the implication of this
23:42 nano banana moment for builders, for
23:44 leaders, for talent, I think the thing
23:46 that I want to leave you with is this.
23:49 Software really is decoupling. It's
23:51 decoupling into a substrate that needs
23:53 to be stable and a pixel that matters a
23:56 whole lot less. If you are in the
23:59 business of either pixels or substrates,
24:01 this is going to affect you. You should
24:02 pay attention. You should think about
24:04 your moat. Is your moat on the substrate? You should think, if you're in
24:08 talent, if you're in design, if you're
24:10 in PM, if you're in engineering, where
24:13 are you at in relation to the substrate
24:15 and the pixels? Are you stuck in a world
24:17 where you're pushing coherent software
24:19 and you don't see a way forward or are
24:21 you moving to that world where you have
24:23 the substrate, the agentic intelligent
24:25 layer and the disposable pixel? I do
24:28 believe B2B SaaS survives as the substrate, and there will be coherent cores that survive up to the UI layer, data providing agentic intelligence layers over the top, etc. But fundamentally, pixels themselves as the single coherent interface for a product are going to go away. We have
24:49 seen that going away for a while as BI
24:51 teams have leaned more and more into
24:54 "just give me the data" from data platforms and data vendors. They don't want the
24:59 fancy dashboard the sales guys sell.
25:01 They just want the data. Well, now we're
25:03 moving to a world where it's not just
25:05 the data science team saying that. It's
25:07 the marketers. It's everybody saying that. So who wins? Products that are
25:11 agent addressable. Products that are
25:13 schema clean. Products that can be
25:16 composed. Teams that treat UI as a
25:19 language and a runtime, not as a set of
25:21 frozen screens. And that goes for you as
25:23 an individual. It goes for the people
25:25 you hire. Who loses? Products whose only moat is that their interface is beautiful. Vendors who resist being
25:31 called by higher level agents and insist
25:33 that users live inside their monolith.
25:35 Like you can only do that for so long.
25:37 People will find a way around it. One of
25:39 the implications of nano banana is that
25:41 a computer use agent that is very good
25:43 is not far behind. And even if you
25:45 insist on living in the monolith, you
25:48 could see a world in 2026 where the user
25:49 can just get up in the morning, have a
25:51 voice conversation with an agent, and
25:55 the agent can use a tool to go and
25:57 browse the monolith software that you
26:00 insist only a human can use, extract the
26:02 data, and bring it back to the user. The
26:04 user is going to be able to make their
26:06 choices. The user is going to be able to
26:07 choose their interface. This is going to
26:09 be true for consumer. It's going to be
26:11 true for business. And it's going to