
khriztianmoreno's Blog


Posts with tag chrome

Gemini in Chrome: The Dawn of the Agentic Web

2026-01-28
chrome, ai, web-development, future

The recent launch of Gemini in Chrome is not just another browser update; it is a clear signal of a paradigm shift we have been anticipating: the transition from a passive web to an agentic web.

Until now, the browser was a window for you to consume content. With the integration of agentic capabilities (like Auto-Browse) and local models (Nano Banana), Chrome becomes an active actor that can reason, navigate, and execute.

For us as developers, this changes the game. It is not just about "generative AI"; it is about how our sites and applications will be consumed by artificial intelligences acting on behalf of users.

The End of "Just Browsing"

The most disruptive feature is undoubtedly Auto Browse. Gemini can now handle multi-step flows: researching, comparing, and completing forms.

What Does This Mean for the Ecosystem?

- User interface (UI) becomes optional: In many cases, the user won't see your beautiful CSS design. They will see the result processed by the agent. If your site relies purely on human visual interaction to convert, you are at risk.
- The revenge of semantic HTML: For years, many modern frameworks have abused "div soup". An AI agent needs to understand the structure of your page to navigate it. A <div> with an onClick is not the same as a <button> or an <a> for a model trying to understand possible actions.
- Universal Commerce Protocol (UCP): Standardization of shopping is a double-edged sword. It reduces friction for the user, but it commoditizes the shopping experience. If your checkout is complex or non-standard, the agent might fail or prefer the competition.

Nano Banana: AI at the Real "Edge" (The Client)

The integration of image models ("Nano Banana") directly into the browser eliminates network latency and server costs.

The opportunity for devs: We can start building rich content editing and generation experiences without burning credits on expensive APIs. Privacy improves drastically because data does not leave the device.
However, this transfers the computational load to the user's device. Resource optimization (battery, memory) will be more critical than ever.

What Should We Start Working On?

As critical developers looking towards the future, here is our immediate task list:

1. Accessibility as SEO for agents. Accessibility (a11y) has always been important for users; now it is critical for your business's survival. AI agents use the accessibility tree and the semantic DOM to understand your site. Action: review your ARIA roles, use semantic tags (<article>, <nav>, <main>), and ensure your forms have clear labels.

2. Structured data (JSON-LD). Don't trust the agent to "read" your text; feed it explicitly. Implementing Schema.org and JSON-LD is no longer just about looking good in Google Search; it is the instruction manual that tells Gemini what products you sell, how much they cost, and how to buy them.

3. APIs for the frontend. We must think of our frontends not just as API consumers, but as exposed sources of information. If your application is a "black box" rendered entirely by JavaScript, without accessible intermediate states, you will be invisible to agents.

4. Prepare for UCP. Investigate the Universal Commerce Protocol. If you run an e-commerce site, compatibility with this protocol could be the difference between an automatic sale managed by Gemini and an abandoned cart.

Conclusion

Gemini in Chrome is the canary in the coal mine. The web is evolving from a catalog of documents to an execution environment for agents. The "SEO" of the future will not be about keywords, but about agentic readability. Developers who continue to build only for human eyes will be left behind. It is time to return to the fundamentals of the open, structured, and semantic web, but with the power of modern AI.

References

Chrome's next chapter with Gemini

I hope this has been useful and/or taught you something new!
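To make the structured-data point concrete, here is a minimal sketch of what Schema.org Product markup can look like, expressed as a JavaScript object and serialized into a JSON-LD script tag. The product name, description, and price are invented placeholders, not real data; adapt the fields to your own catalog.

```javascript
// Sketch: serializing a Schema.org Product as JSON-LD.
// All product details below are hypothetical placeholders.
const productJsonLd = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Mechanical Keyboard",
  description: "A hypothetical product used to illustrate the markup.",
  offers: {
    "@type": "Offer",
    price: "79.99",
    priceCurrency: "USD",
    availability: "https://schema.org/InStock",
  },
};

// Embed it in the page head so crawlers and agents can read it.
const jsonLdScriptTag =
  `<script type="application/ld+json">` +
  `${JSON.stringify(productJsonLd)}</script>`;

console.log(jsonLdScriptTag);
```

Because the payload is plain JSON rather than rendered DOM, an agent does not have to "reverse-engineer" your UI to learn what you sell and at what price.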

Mastering Chrome DevTools for Web Performance Optimization

2025-11-17
performance, devtools, chrome, chrome-devtools

Turn Chrome DevTools from a viewer into a performance debugging weapon.

Most teams open DevTools too late. Or they look at the wrong panels, drowning in noise while missing the signals that actually affect user experience.

If you are a senior frontend engineer or performance owner, you know that "it feels slow" isn't a bug report; it's a symptom. This guide is for those who need to diagnose that symptom, understand the root cause, and verify the fix.

We are focusing on Chrome DevTools features that directly map to Core Web Vitals. No fluff, just the workflows you need to fix Interaction to Next Paint (INP), Largest Contentful Paint (LCP), and Cumulative Layout Shift (CLS).

1. Mental model: from symptom to root cause

Before clicking anything, you need the right mental model. Metrics tell you what is wrong. DevTools explains why.

Performance isn't about magic numbers; it's about the main thread. The browser's main thread is where JavaScript runs, HTML is parsed, and styles are calculated. It is a single-lane highway. If a heavy truck (a long task) is blocking the lane, fast cars (user clicks, animations) are stuck in traffic.

Key rule: if the main thread is blocked, UX is broken.

2. Performance panel: the center of truth

The Performance panel allows you to record exactly what the browser is doing over a period of time. It records:

- Main thread activity: JS execution, parsing, garbage collection.
- Rendering pipeline: style calculation, layout, paint, compositing.
- Network timing: when resources are requested and received relative to execution.
- User input handling: how long the browser took to respond to a click.

Recording a useful trace

Idle traces are useless. You need interaction traces.

1. Open DevTools (Cmd+Option+I / Ctrl+Shift+I) and go to the Performance tab.
2. Check Screenshots and Web Vitals in the capture settings. Memory is usually optional unless you suspect a leak.
3. Click the Record button (circle icon).
4. Interact with the page (click the button, scroll the list, open the modal).
5. Click Stop.

3. Reading the Performance timeline

The resulting trace can be intimidating. Ignore 90% of it initially. Focus on these sections:

- FPS & CPU: high-level health check. Solid blocks of color in CPU mean the main thread is busy.
- Network: thin lines showing resource loading order.
- Main: the flame chart of call stacks. This is where you spend most of your time.
- Frames: screenshots of what the user saw at that millisecond.

The Experience track

This is your best friend. It explicitly marks:

- LCP: where the Largest Contentful Paint occurred.
- Layout shifts: red diamonds indicating CLS.
- Long tasks: tasks taking more than 50ms (red triangles).

Spotting long tasks

A "long task" is any task that keeps the main thread busy for more than 50ms. In the Main section, look for gray bars with red triangles at the top corner. These are the tasks blocking the browser from responding to user input (INP).

4. Debugging LCP with DevTools

LCP measures loading performance. To fix it, you need to know what the element is and why it was late.

1. Identify the LCP element: in the Timings or Experience track, find the LCP marker.
2. Inspect the element: hovering over the LCP marker often highlights the actual DOM node.
3. Analyze the delay:
   - Resource load delay: was the image discovered late? (e.g., a lazy-loaded hero image).
   - Resource load duration: was the network slow, or the image too large?
   - Render delay: was the image loaded but waiting for a main-thread task to finish before painting?

Typical LCP root causes:

- Late discovery: the <img> tag is generated by JavaScript or has loading="lazy".
- Render blocking: huge CSS bundles or synchronous JS in the <head> pausing the parser.
- Server TTFB: the backend took too long to send the initial HTML.

```html
<!-- ❌ Bad: Lazy loading the LCP element (e.g. hero image) -->
<img src="hero.jpg" loading="lazy" alt="Hero Image" />

<!-- ✅ Good: Eager loading + fetchpriority -->
<img src="hero.jpg" loading="eager" fetchpriority="high" alt="Hero Image" />
```

Reference: Optimize Largest Contentful Paint

5. Debugging INP with DevTools

INP is the metric that kills single-page applications (SPAs). It measures the latency of user interactions.

1. Use the Interactions track: look for the specific interaction (click, keypress) you recorded.
2. Expand the interaction: you will see it broken down into three phases:
   - Input delay: time waiting for the main thread to become free.
   - Processing time: time running your event handlers.
   - Presentation delay: time waiting for the browser to paint the next frame.
3. Visually correlate with the main thread: click the interaction bar and look directly below it in the Main track.
   - If you see a massive yellow block of JavaScript under the interaction, your event handler is too slow (processing time).
   - If you see a massive block of JS before the interaction starts, the main thread was busy doing something else (input delay).

Common offenders:

- Parsing large JSON payloads.
- React/Vue reconciliation (rendering too many components).
- Synchronous loops or expensive calculations.

```javascript
// ❌ Bad: Blocking the main thread with heavy work
button.addEventListener("click", () => {
  const result = heavyCalculation(); // Blocks for 200ms
  updateUI(result);
});

// ✅ Good: Yielding to the main thread
button.addEventListener("click", async () => {
  showSpinner();
  // Yield to the main thread so the browser can paint the spinner
  await new Promise((resolve) => setTimeout(resolve, 0));
  const result = heavyCalculation();
  updateUI(result);
});
```

Fix workflow: identify the function in the flame chart → optimize or defer it → record again → verify the block is smaller.

Reference: Interaction to Next Paint (INP)

6. Debugging CLS with DevTools

Layout shifts are annoying and confusing. DevTools visualizes them clearly.

1. Open the Command Menu (Cmd+Shift+P / Ctrl+Shift+P) and type "Rendering".
2. Enable "Layout Shift Regions".
3. As you interact with the page, shifted elements will flash blue.
4. In the performance trace, look at the Experience track for red diamonds. Click one. The Summary tab at the bottom will list exactly which nodes moved, along with their previous and current coordinates.

Common CLS patterns:

- Font swaps (FOUT/FOIT): text renders, then the web font loads, changing the size.
- Image resize: images without width and height attributes.
- Late-injected UI: banners or ads inserting themselves at the top of the content.

```css
/* ❌ Bad: No space reserved for the image */
img.hero {
  width: 100%;
  height: auto;
}

/* ✅ Good: Reserve space with aspect-ratio */
img.hero {
  width: 100%;
  height: auto;
  aspect-ratio: 16 / 9;
}
```

Reference: Optimize Cumulative Layout Shift

7. Live Metrics screen

The Live Metrics view (in the Performance panel sidebar or landing page) provides real-time feedback without a full trace.

Why it matters:

- Instant feedback: see LCP and CLS values update as you resize the window or navigate.
- Field-aligned: it uses the same implementation as the Web Vitals extension.

Use cases:

- Testing hover states and small interactions.
- Validating SPA route transitions.
- Quick sanity checks before committing code.

Note: this is still "lab data" running on your machine, not real user data (CrUX).

8. Insights panel

The Performance Insights panel is an experimental but powerful automated analysis layer. It uses the trace data to highlight risks automatically.

Key features:

- Layout shift culprits: it points directly to the animation or DOM update that caused a shift.
- Render-blocking requests: it identifies CSS/JS that delayed the First Contentful Paint.
- Long main-thread tasks: it suggests how to break them up.

Use Insights as a hint, not a verdict. It points you to the right place in the flame chart, but you still need to interpret the code.

9. CPU and network throttling (mandatory)

Developing on a MacBook Pro with fiber internet is a lie. Your users are on mid-tier Android devices with spotty 4G.

- CPU throttling: set it to 4x slowdown. This roughly simulates a mid-range Android device. It exposes "death by a thousand cuts": small scripts that feel instant on desktop but freeze a phone for 300ms.
- Network throttling: Fast 4G or Slow 4G. Critical for debugging LCP (image load times) and font loading behavior.

Fast Wi-Fi hides bad engineering. Always throttle when testing performance.

10. Putting it all together: a repeatable workflow

1. Detect: use PageSpeed Insights or CrUX to identify which metric is failing.
2. Reproduce: open DevTools and enable throttling (CPU 4x, Network 4G).
3. Record: start tracing, perform the user action, stop tracing.
4. Inspect: find the red/yellow markers in the Experience/Main tracks.
5. Fix: apply the code change (defer JS, optimize images, reduce DOM depth).
6. Verify: re-record and compare the trace. Did the long task disappear? Did the LCP marker move left?

Conclusion

DevTools is not optional. Performance is observable. Every Core Web Vitals issue leaves a trace; you just need to know where to look.

If you cannot explain a performance problem in DevTools, you do not understand it yet.

Resources:

- Chrome DevTools Documentation
- web.dev Performance Guides
- Google Search Central CWV Docs

I hope this has been helpful and/or taught you something new!

Until next time
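The "break up long main-thread tasks" advice above can be sketched in code. The following is a minimal illustration, not a DevTools API: `processItems`, `expensiveWork`, and `CHUNK_SIZE` are invented names, and the yielding trick is the same `setTimeout(0)` pattern used in the INP example.

```javascript
// Sketch: splitting one long task into chunks that yield to the event loop,
// so each chunk stays under the ~50ms long-task threshold.
// processItems, expensiveWork, and CHUNK_SIZE are illustrative names.
const CHUNK_SIZE = 100;

function expensiveWork(item) {
  // Stand-in for real per-item work (parsing, formatting, DOM prep, etc.)
  return item * 2;
}

async function processItems(items) {
  const results = [];
  for (let i = 0; i < items.length; i += CHUNK_SIZE) {
    // Process one slice synchronously...
    for (const item of items.slice(i, i + CHUNK_SIZE)) {
      results.push(expensiveWork(item));
    }
    // ...then yield so the browser can handle input and paint.
    // (In newer Chrome, scheduler.yield() is a more explicit alternative.)
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
  return results;
}

processItems([1, 2, 3]).then((out) => console.log(out)); // logs [2, 4, 6]
```

In a trace, the effect is visible immediately: one solid gray bar with a red triangle becomes a series of short tasks with gaps where the interaction can be handled.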

Navigating the Future - Understanding and Measuring Soft Navigations for SPAs

2025-11-12
performance, web-development, javascript, chrome

Explore the concept of "soft navigations" in Single Page Applications (SPAs), why traditional Core Web Vitals measurement has been challenging for them, and the ongoing efforts by the Chrome team to standardize and enable reporting for these dynamic content changes.

Demystifying Core Web Vitals - A Developer's Guide to LCP, INP, and CLS

2025-10-19
web-performance, core-web-vitals, lighthouse, web-development, crux, chrome, performance, devtools, chrome-devtools

Core Web Vitals are ranking signals, but most teams still optimize them like lab-only scorecards. This guide turns CWV into actionable engineering work: how to measure (field + lab), how to debug root causes in DevTools, and which fixes actually move the 75th percentile.

Introducing Chrome DevTools MCP

2025-09-30
javascript, chrome, devtools, ai, mcp, debugging, performance, chrome-devtools

I participated in the Chrome DevTools MCP Early Access Program and put the feature through its paces on real projects. I focused on four scenarios: fixing a styling issue, running performance traces and extracting insights, debugging a failing network request, and validating optimal caching headers for assets. This post shares that hands-on experience: what worked, where it shines, and how I’d use it day-to-day.

Chrome DevTools MCP gives AI coding assistants actual visibility into a live Chrome browser so they can inspect, test, measure, and fix issues based on real signals, not guesses. In practice, this means your agent can open pages, click, read the DOM, collect performance traces, analyze network requests, and iterate on fixes in a closed loop.