<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[#100DaysOfAI by Karan Balaji]]></title><description><![CDATA[A daily #100DaysOfAI blog on Generative UIs, LLMs, AI product design, and real world examples shaping how we build software.]]></description><link>https://blog.karanbalaji.com</link><generator>RSS for Node</generator><lastBuildDate>Mon, 20 Apr 2026 12:31:18 GMT</lastBuildDate><atom:link href="https://blog.karanbalaji.com/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Day 5/100: Open Source AI SDKs and Frameworks for Next-Gen Agents]]></title><description><![CDATA[Intro to AI SDKS & Agent Frameworks 2026
In the fast-evolving world of artificial intelligence, open source SDKs and frameworks are empowering developers to create autonomous agents that handle complex tasks with ease. As we step into 2026, these too...]]></description><link>https://blog.karanbalaji.com/100-days-of-ai-day-5-open-source-ai-sdks-and-frameworks-2026</link><guid isPermaLink="true">https://blog.karanbalaji.com/100-days-of-ai-day-5-open-source-ai-sdks-and-frameworks-2026</guid><category><![CDATA[AI]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Artificial Intelligence]]></category><category><![CDATA[technology]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Sun, 04 Jan 2026 21:09:20 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1767560883767/bbef19ba-24e2-48a2-8764-813d637646f5.png" length="0" type="image/png"/><content:encoded><![CDATA[<h1 id="heading-intro-to-ai-sdks-amp-agent-frameworks-2026">Intro to AI SDKs &amp; Agent Frameworks 2026</h1>
<p>In the fast-evolving world of artificial intelligence, open source SDKs and frameworks are empowering developers to create autonomous agents that handle complex tasks with ease. As we step into 2026, these tools are transforming how we build AI applications, from simple chatbots to multi-agent systems that collaborate like human teams. This article dives deep into the leading open source options, including TanStack AI, Vercel AI SDK (version 6), Google Agent Development Kit (ADK), Microsoft Agent Framework, OpenAI Agents SDK, LangChain/LangGraph, Mastra, PydanticAI, and emerging ones like Agno (formerly Phidata), CrewAI, and DSPy. Backed by the latest research from 2025-2026 sources, we'll explore their features to help you choose the right one for your projects.</p>
<h2 id="heading-what-are-open-source-ai-sdks-and-frameworks">What Are Open Source AI SDKs and Frameworks?</h2>
<p>Open source AI SDKs and frameworks are libraries and toolkits that simplify the development of AI agents—software entities that perceive environments, make decisions, and take actions autonomously. These tools abstract away complexities like LLM integrations, tool calling, memory management, and orchestration, allowing developers to focus on innovation.</p>
<p>At their core, they provide building blocks for agentic AI: prompts for guiding models, tools for external interactions (like APIs or databases), memory for context retention, and workflows for multi-step reasoning. For instance, frameworks like LangChain emphasize modular chains, while others like Microsoft Agent Framework focus on enterprise-scale multi-agent collaboration.</p>
<p>Key trends in 2026 include model-agnostic designs (supporting OpenAI, Anthropic, Gemini, and more), support for standards like Model Context Protocol (MCP) for interoperability, and emphasis on observability for debugging production agents. Open source status ensures community-driven updates, cost-free access, and customization, making them ideal for startups and enterprises alike. With over 80K GitHub stars for leaders like LangChain, these frameworks are battle-tested and rapidly adopted.</p>
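<p>To ground these building blocks, here is a framework-free sketch of the core agent loop in plain Python. Everything in it (the <code>fake_llm</code> stand-in, the toy weather tool) is illustrative, not any particular SDK's API:</p>

```python
def fake_llm(messages, tools):
    """Stand-in for a real LLM call: 'decides' to use a tool, then answers."""
    last = messages[-1]["content"]
    if "weather" in last and not any(m["role"] == "tool" for m in messages):
        return {"tool_call": ("get_weather", {"city": "NYC"})}
    return {"content": "It is 72F in NYC."}

def get_weather(city):
    return {"temp": 72}  # a real agent would call an external API here

TOOLS = {"get_weather": get_weather}

def run_agent(user_input):
    # Memory: the running transcript of the conversation
    messages = [{"role": "user", "content": user_input}]
    while True:
        reply = fake_llm(messages, TOOLS)
        if "tool_call" in reply:              # the model requested a tool
            name, args = reply["tool_call"]
            result = TOOLS[name](**args)
            messages.append({"role": "tool", "content": str(result)})
        else:
            return reply["content"]           # final answer, loop ends

print(run_agent("weather in NYC?"))
```

<p>Every framework covered below wraps some variant of this loop, differing mainly in how tools, memory, and multi-agent orchestration are expressed.</p>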
<h2 id="heading-how-to-implement-them-practical-examples-and-use-cases">How to Implement Them: Practical Examples and Use Cases</h2>
<p>Implementing these frameworks involves setting up environments, defining agents, integrating tools, and deploying workflows. Below, we break down each major one with code snippets and real-world applications, drawing from official docs and 2026 benchmarks.</p>
<h3 id="heading-tanstack-ai">TanStack AI</h3>
<p>TanStack AI is a lightweight, type-safe SDK for production AI experiences, with strong React and Solid integrations. It's model-agnostic and excels in streaming responses.</p>
<p><strong>Implementation Example:</strong> Define tools and chat sessions in TypeScript.</p>
<pre><code class="lang-typescript"><span class="hljs-keyword">import</span> { chat } <span class="hljs-keyword">from</span> <span class="hljs-string">'@tanstack/ai'</span>;
<span class="hljs-keyword">import</span> { toolDefinition } <span class="hljs-keyword">from</span> <span class="hljs-string">'@tanstack/ai'</span>;
<span class="hljs-keyword">import</span> { openaiText } <span class="hljs-keyword">from</span> <span class="hljs-string">'@tanstack/ai-openai'</span>;
<span class="hljs-keyword">import</span> { z } <span class="hljs-keyword">from</span> <span class="hljs-string">'zod'</span>; <span class="hljs-comment">// needed for the schemas below</span>

<span class="hljs-keyword">const</span> getWeatherDef = toolDefinition({
  name: <span class="hljs-string">'getWeather'</span>,
  inputSchema: z.object({ city: z.string() }),
  outputSchema: z.object({ temp: z.number() }),
});

<span class="hljs-keyword">const</span> getWeather = getWeatherDef.server(<span class="hljs-keyword">async</span> ({ city }) =&gt; {
  <span class="hljs-comment">// Fetch from API</span>
  <span class="hljs-keyword">return</span> { temp: <span class="hljs-number">72</span> };
});

chat({
  adapter: openaiText(<span class="hljs-string">'gpt-4o'</span>),
  messages: [{ role: <span class="hljs-string">'user'</span>, content: <span class="hljs-string">'Weather in NYC?'</span> }],
  tools: [getWeather],
});
</code></pre>
<p><strong>Use Cases:</strong> Building real-time chat interfaces for e-commerce recommendations or multimodal apps processing images and text.</p>
<h3 id="heading-vercel-ai-sdk-version-6">Vercel AI SDK (Version 6)</h3>
<p>Vercel AI SDK unifies LLM integrations for web apps, supporting React, Next.js, and more. It's streaming-first and provider-agnostic.</p>
<p><strong>Implementation Example:</strong> Generate text across models.</p>
<pre><code class="lang-typescript"><span class="hljs-keyword">import</span> { generateText } <span class="hljs-keyword">from</span> <span class="hljs-string">"ai"</span>;

<span class="hljs-keyword">const</span> { text } = <span class="hljs-keyword">await</span> generateText({
  model: <span class="hljs-string">"anthropic/claude-sonnet-4.5"</span>,
  prompt: <span class="hljs-string">"Explain quantum computing simply."</span>,
});
</code></pre>
<p><strong>Use Cases:</strong> Creating RAG-based knowledge bases for internal tools or semantic search in apps like customer support portals.</p>
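<p>The semantic-search use case ultimately reduces to comparing embedding vectors. A minimal pure-Python sketch, using toy 3-dimensional vectors in place of real model embeddings:</p>

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy embeddings; in practice these come from an embedding model
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
}
query = [0.8, 0.2, 0.1]  # e.g. an embedded "how do I get my money back?"
best = max(docs, key=lambda d: cosine(query, docs[d]))
# best → "refund policy"
```

<p>A RAG pipeline adds one step: the best-matching documents are injected into the prompt before the generation call.</p>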
<h3 id="heading-google-agent-development-kit-adk">Google Agent Development Kit (ADK)</h3>
<p>Google ADK is a modular, open source framework for agent orchestration, optimized for Gemini but model-agnostic.</p>
<p><strong>Implementation Example:</strong> ADK's official docs focus on workflow agents and containerized deployment; the snippet below is conceptual rather than exact API.</p>
<pre><code class="lang-python"><span class="hljs-comment"># Conceptual sketch — illustrative names, not the exact ADK API</span>
<span class="hljs-keyword">from</span> adk <span class="hljs-keyword">import</span> SequentialWorkflowAgent

agent = SequentialWorkflowAgent(tools=[search_tool])  <span class="hljs-comment"># search_tool defined elsewhere</span>
result = agent.run(<span class="hljs-string">"Research AI trends"</span>)
</code></pre>
<p><strong>Use Cases:</strong> Multimodal agents for video/audio processing in healthcare diagnostics or secure GCP deployments for compliance-heavy industries.</p>
<h3 id="heading-microsoft-agent-framework">Microsoft Agent Framework</h3>
<p>This unified open source engine combines AutoGen and Semantic Kernel for multi-agent systems, with strong enterprise features.</p>
<p><strong>Implementation Example:</strong> Define and orchestrate agents.</p>
<pre><code class="lang-python"><span class="hljs-keyword">from</span> agent_framework <span class="hljs-keyword">import</span> Workflow, ai_function

<span class="hljs-meta">@ai_function</span>
<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">search</span>(<span class="hljs-params">query: str</span>) -&gt; str:</span>
    <span class="hljs-keyword">return</span> <span class="hljs-string">"Results"</span>

workflow = Workflow(agents=[researcher])  <span class="hljs-comment"># 'researcher': an agent defined elsewhere</span>
result = workflow.run(<span class="hljs-string">"Analyze trends"</span>)
</code></pre>
<p><strong>Use Cases:</strong> Automating workflows in finance (e.g., fraud detection) or consulting, with human-in-the-loop approvals.</p>
<h3 id="heading-openai-agents-sdk">OpenAI Agents SDK</h3>
<p>A minimalist Python SDK for multi-agent workflows, with built-in tracing and guardrails.</p>
<p><strong>Implementation Example:</strong> Simple agent setup.</p>
<pre><code class="lang-python"><span class="hljs-keyword">from</span> agents <span class="hljs-keyword">import</span> Agent, Runner

agent = Agent(name=<span class="hljs-string">"Helper"</span>, instructions=<span class="hljs-string">"Assist user"</span>)
result = Runner.run_sync(agent, <span class="hljs-string">"Solve 2+2"</span>)
print(result.final_output)
</code></pre>
<p><strong>Use Cases:</strong> Customer support bots with handoffs or educational tools for interactive learning.</p>
<h3 id="heading-langchain-langgraph">LangChain / LangGraph</h3>
<p>LangChain builds LLM chains; LangGraph adds graph-based workflows for stateful agents.</p>
<p><strong>Implementation Example:</strong> Create a reactive agent.</p>
<pre><code class="lang-python"><span class="hljs-keyword">from</span> langgraph.prebuilt <span class="hljs-keyword">import</span> create_react_agent

agent = create_react_agent(model=<span class="hljs-string">'gpt-5'</span>, tools=[tool])  <span class="hljs-comment"># 'tool' defined elsewhere</span>
response = agent.invoke({<span class="hljs-string">"messages"</span>: [{<span class="hljs-string">"role"</span>: <span class="hljs-string">"user"</span>, <span class="hljs-string">"content"</span>: <span class="hljs-string">"Query data"</span>}]})
</code></pre>
<p><strong>Use Cases:</strong> Complex orchestrations like supply chain automation or research agents in academia.</p>
<h3 id="heading-mastra">Mastra</h3>
<p>Mastra focuses on TypeScript-based AI apps with workflows and human-in-the-loop.</p>
<p><strong>Implementation Example:</strong> Define an agent workflow.</p>
<pre><code class="lang-typescript"><span class="hljs-comment">// Conceptual: Chain steps</span>
workflow.then(step1).branch(step2, step3);
</code></pre>
<p><strong>Use Cases:</strong> Domain-specific copilots for legal or finance, integrating with Next.js for web apps.</p>
<h3 id="heading-pydanticai">PydanticAI</h3>
<p>Type-safe framework using Pydantic for validation, with support for durable execution.</p>
<p><strong>Implementation Example:</strong> Bank support agent.</p>
<pre><code class="lang-python"><span class="hljs-keyword">from</span> pydantic_ai <span class="hljs-keyword">import</span> Agent

agent = Agent(<span class="hljs-string">'openai:gpt-5'</span>, instructions=<span class="hljs-string">'Handle queries'</span>)
result = agent.run_sync(<span class="hljs-string">'Check balance'</span>)
</code></pre>
<p><strong>Use Cases:</strong> Reliable agents for banking risk assessment or long-running workflows in e-commerce.</p>
<h3 id="heading-additional-emerging-frameworks">Additional Emerging Frameworks</h3>
<ul>
<li><p><strong>Agno (Phidata):</strong> Fast multi-agent runtime; open source. Example: Teams for research squads. Use: Social media automation.</p>
</li>
<li><p><strong>CrewAI:</strong> Role-based multi-agent teams. Example: Analyst-Editor workflows. Use: Prototyping CX bots.</p>
</li>
<li><p><strong>DSPy:</strong> Optimizes reasoning pipelines. Example: Eval-driven workflows. Use: Experiment-heavy research.</p>
</li>
</ul>
<h2 id="heading-comparing-the-top-open-source-ai-frameworks-pros-cons-best-use-cases-and-scenarios">Comparing the Top Open Source AI Frameworks: Pros, Cons, Best Use Cases, and Scenarios</h2>
<p>To help you decide which framework fits your needs, here's a comparison based on 2026 insights. This table highlights pros, cons, and ideal scenarios, drawing from community benchmarks and expert analyses.</p>
<div class="hn-table">
<table>
<thead>
<tr>
<th>Framework</th><th>Pros</th><th>Cons</th><th>Best Use Cases/Scenarios</th></tr>
</thead>
<tbody>
<tr>
<td>TanStack AI</td><td>Lightweight, type-safe; strong React/Solid integrations; streaming-first.</td><td>Limited to UI-heavy apps; requires assembly of components.</td><td>Real-time chat interfaces; e-commerce recommendations.</td></tr>
<tr>
<td>Vercel AI SDK</td><td>Flexible TypeScript toolkit; deep React/Next.js support; multi-provider LLMs; MCP client.</td><td>Low-level requiring manual setup; no native observability.</td><td>UI-heavy products; interactive human-in-the-loop experiences; dashboard agents.</td></tr>
<tr>
<td>Google ADK</td><td>Modular for resilient architectures; enterprise security; multimodal support; MCP/A2A.</td><td>Python-heavy; Gemini bias; smaller community.</td><td>Multi-agent systems on GCP; compliance-heavy industries; hierarchical compositions.</td></tr>
<tr>
<td>Microsoft Agent Framework</td><td>Unified AutoGen/Semantic Kernel; enterprise governance; multi-language SDKs; strong observability.</td><td>Rapid API changes; documentation lags.</td><td>Azure enterprises; secure multi-agent workflows; fraud detection.</td></tr>
<tr>
<td>OpenAI Agents SDK</td><td>Minimalist with tracing/guardrails; supports TS/Python; ChatKit UI.</td><td>Tied to OpenAI models; high cloud lock-in.</td><td>OpenAI-first agents; low-code ChatGPT integrations; educational tools.</td></tr>
<tr>
<td>LangChain/LangGraph</td><td>Massive ecosystem; graph-based orchestration; exceptional observability via LangSmith.</td><td>Documentation sprawl; complexity for simple tasks; cloud lock-in.</td><td>Complex stateful workflows; automation in enterprises; R&amp;D pipelines.</td></tr>
<tr>
<td>Mastra</td><td>Structured primitives for backend; multi-provider; flexible deployment including managed cloud.</td><td>No built-in UI; moderate lock-in if using cloud.</td><td>TypeScript backend agents; production systems with workflows; self-hosted setups.</td></tr>
<tr>
<td>PydanticAI</td><td>Type-safe with durable execution; excellent IDE support; clear API.</td><td>Newer with smaller ecosystem; Python-only; async quirks.</td><td>Reliable long-running processes; schema-first development; banking assessments.</td></tr>
<tr>
<td>Agno (Phidata)</td><td>Fast performance; multi-modal; built-in memory/UI; minimal code.</td><td>Experimental reasoning; smaller community.</td><td>Production memory-rich agents; collaborative setups; web search with visuals.</td></tr>
<tr>
<td>CrewAI</td><td>Role-based teams; simple API; developer-friendly; robust orchestration.</td><td>Less deterministic; limited security for enterprise.</td><td>Rapid prototyping; CX bots; startups building collaborative systems.</td></tr>
<tr>
<td>DSPy</td><td>Optimizable workflows; eval-driven; high-quality outputs fast.</td><td>Non-transparent execution; not OpenAI-compatible for observability.</td><td>Experiment-heavy research; performance optimization; reasoning pipelines.</td></tr>
</tbody>
</table>
</div><p>This comparison underscores the diversity: choose LangGraph for control, CrewAI for speed, or Microsoft for enterprise security.</p>
<h2 id="heading-why-these-frameworks-are-important">Why These Frameworks Are Important</h2>
<p>In 2026, AI agents are projected to handle 40% of business tasks, driving efficiency and innovation. Open source frameworks democratize access, fostering rapid iteration and community contributions. They reduce vendor lock-in, enhance security through standards like MCP, and enable scalable deployments—from edge computing to cloud enterprises.</p>
<p>For developers, they cut development time by 50-70% via pre-built primitives, while enterprises benefit from cost savings and compliance. In a world where AI ethics and interoperability matter, these tools ensure transparent, adaptable systems. Whether you're a beginner prototyping with CrewAI or an expert scaling with Microsoft Agent Framework, they unlock AI's full potential for smarter, autonomous solutions.</p>
<p>Sources:</p>
<ul>
<li><p><a target="_blank" href="https://softcery.com/lab/top-14-ai-agent-frameworks-of-2025-a-founders-guide-to-building-smarter-systems">Softcery: 14 AI Agent Frameworks Compared</a></p>
</li>
<li><p><a target="_blank" href="https://alphacorp.ai/top-5-ai-agent-frameworks-november-2025/">AlphaCorp AI: Top 5 AI Agent Frameworks (November 2025)</a></p>
</li>
<li><p><a target="_blank" href="https://ai-sdk.dev/docs/introduction">Vercel AI SDK Documentation</a></p>
</li>
<li><p><a target="_blank" href="https://mastra.ai/docs">Mastra Docs</a></p>
</li>
<li><p><a target="_blank" href="https://google.github.io/adk-docs/">Google Agent Development Kit</a></p>
</li>
<li><p><a target="_blank" href="https://www.agno.com/">Agno (Phidata) Official Site</a></p>
</li>
<li><p><a target="_blank" href="https://langwatch.ai/blog/best-ai-agent-frameworks-in-2025-comparing-langgraph-dspy-crewai-agno-and-more">LangWatch: Best AI Agent Frameworks in 2025</a></p>
</li>
<li><p><a target="_blank" href="https://fashn.ai/blog/choosing-the-best-ai-agent-framework-in-2025">FASHN AI: Choosing the Best AI Agent Framework in 2025</a></p>
</li>
<li><p><a target="_blank" href="https://devblogs.microsoft.com/foundry/introducing-microsoft-agent-framework-the-open-source-engine-for-agentic-ai-apps/">Microsoft Agent Framework Announcement</a></p>
</li>
<li><p><a target="_blank" href="https://ai.pydantic.dev/">Pydantic AI Documentation</a></p>
</li>
<li><p><a target="_blank" href="https://openai.github.io/openai-agents-python/">OpenAI Agents SDK Documentation</a></p>
</li>
<li><p><a target="_blank" href="https://tanstack.com/ai/latest/docs">TanStack AI Overview</a></p>
</li>
<li><p><a target="_blank" href="https://www.ampcome.com/post/top-7-ai-agent-frameworks-in-2025">Ampcome: Top 7 AI Agent Frameworks in 2025</a></p>
</li>
<li><p><a target="_blank" href="https://www.vellum.ai/blog/top-ai-agent-frameworks-for-developers">The Best AI Agent Frameworks For Developers - Vellum AI</a></p>
</li>
<li><p><a target="_blank" href="https://research.aimultiple.com/agentic-frameworks/">Top 5 Open-Source Agentic Frameworks in 2026</a></p>
</li>
<li><p><a target="_blank" href="https://medium.com/%40Deep-concept/top-10-open-source-ai-agent-frameworks-to-know-in-2026-91395d47ba12">Top 10 open-Source AI Agent Frameworks to Know in 2026…</a></p>
</li>
<li><p><a target="_blank" href="https://www.datacamp.com/blog/best-ai-agents">The Best AI Agents in 2026: Tools, Frameworks, and Platforms ...</a></p>
</li>
<li><p><a target="_blank" href="https://relipa.global/ai-agent-frameworks/">AI Agent Frameworks In 2026: Which One Fits Your Use Case?</a></p>
</li>
<li><p><a target="_blank" href="https://www.excellentwebworld.com/best-ai-frameworks/">What Are the Top AI Frameworks in 2026? Pros, Cons &amp; Use Cases</a></p>
</li>
<li><p><a target="_blank" href="https://www.analyticsvidhya.com/blog/2024/07/ai-agent-frameworks/">Top 7 Frameworks for Building AI Agents in 2026 - Analytics Vidhya</a></p>
</li>
<li><p><a target="_blank" href="https://www.instaclustr.com/education/open-source-ai/agentic-ai-frameworks-top-8-options-in-2026/">Agentic AI Frameworks: Top 8 Options in 2026 - NetApp Instaclustr</a></p>
</li>
<li><p><a target="_blank" href="https://www.reddit.com/r/AI_Agents/comments/1hq9il6/best_ai_agent_frameworks_in_2025_a_comprehensive/">Best AI Agent Frameworks in 2025: A Comprehensive Guide - Reddit</a></p>
</li>
<li><p><a target="_blank" href="https://sthenostechnologies.com/blogs/best-ai-agent-frameworks/">9 Best AI Agent Frameworks For 2026: A Developer's Guide</a></p>
</li>
</ul>
]]></content:encoded></item><item><title><![CDATA[Day 4/100: State of Offline On-Device AI in 2025 & beyond]]></title><description><![CDATA[Day 4/100: Intro to Offline On-Device AI
What is Offline On-Device AI?
Offline on-device AI refers to artificial intelligence systems that run entirely on local hardware, such as smartphones, laptops, or edge devices, without needing an internet conn...]]></description><link>https://blog.karanbalaji.com/100-days-of-ai-day-4-offline-on-device-ai-in-2025-and-beyond</link><guid isPermaLink="true">https://blog.karanbalaji.com/100-days-of-ai-day-4-offline-on-device-ai-in-2025-and-beyond</guid><category><![CDATA[AI]]></category><category><![CDATA[Artificial Intelligence]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[technology]]></category><category><![CDATA[tech ]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Wed, 31 Dec 2025 03:45:45 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1767152606119/853b2bf5-873f-4a06-8734-4e83e0b9041c.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-day-4100-intro-to-offline-on-device-ai">Day 4/100: Intro to Offline On-Device AI</h1>
<h2 id="heading-what-is-offline-on-device-ai">What is Offline On-Device AI?</h2>
<p>Offline on-device AI refers to artificial intelligence systems that run entirely on local hardware, such as smartphones, laptops, or edge devices, without needing an internet connection or cloud servers. This approach processes data and generates outputs directly on the device, emphasizing privacy, low latency, and reliability. By the end of 2025, on-device AI has matured significantly, driven by advancements in hardware like neural processing units (NPUs) and efficient models tailored for constrained environments. Models now handle complex tasks such as multimodal processing within device constraints, often peaking at around 6GB of memory.</p>
<p>The current state as of late 2025 shows widespread adoption across industries. For instance, AI models like Llama 3.1 70B serve as generalist benchmarks, while specialized ones like DeepSeek R1 enable advanced reasoning on smaller devices. Companies such as Apple, Google, and Qualcomm have optimized runtimes, allowing GPT-style models to run locally on everyday hardware. Offline AI chat apps are booming in sectors like tourism and e-commerce, where models like Gemini Nano manage 1.5 billion parameters for seamless interactions in low-connectivity areas. New devices, such as Holiverse's offline AI hardware, embed private AI to keep data control with users, eliminating external server dependencies.</p>
<p>Looking ahead, the future of offline on-device AI points to hybrid systems by 2026 and beyond, blending local and cloud processing for optimal performance. Trends include context-aware AI in wearables and earbuds, enabling always-on intelligence with offline reliability. Mobile apps will shift more processing to the edge, reducing latency and enhancing privacy, with projections that 90% of new apps will incorporate on-device AI capabilities. An "offline renaissance" may emerge, prioritizing local compute amid declining social media usage and rising voice-first tech.</p>
<h3 id="heading-the-state-of-offline-ai-in-2025">The State of Offline AI in 2025</h3>
<p>As of end-2025, offline AI remains underserved yet is scaling rapidly; the global AI market, valued at USD 233.46 billion in 2024, is projected to keep growing strongly. Challenges include hardware-software gaps, where model sizes grow faster than device capabilities, but solutions such as NPUs in laptops address this by enabling large language models (LLMs) to run locally. Decentralized infrastructure and open-source fine-tuning are key trends, fostering specialized models for edge use.</p>
<h3 id="heading-the-open-source-framework-stack-for-on-device-ai">The Open-Source Framework Stack for On-Device AI</h3>
<p>Open-source frameworks form the backbone of on-device AI development. TensorFlow Lite and PyTorch Mobile lead for mobile inference, while ONNX Runtime ensures model portability across devices. Other notables include Hugging Face Transformers for model access, Scikit-learn for lightweight ML, and specialized tools like NNStreamer and NNTrainer for on-device learning and reasoning. Repositories like Awesome LLMs on Device curate resources for running large models locally. Emerging frameworks like DroidRun enable AI agents on Android, automating workflows directly on hardware.</p>
<h3 id="heading-the-offline-on-device-ai-ecosystem-explained">The Offline On-Device AI Ecosystem Explained</h3>
<p>The ecosystem spans compilers, runtimes, optimization tools, and local servers, creating a full pipeline for efficient edge AI.</p>
<h4 id="heading-compilers-in-on-device-ai">Compilers in On-Device AI</h4>
<p>Tools like ML compilers optimize models for edge deployment, making them faster and more secure. Techniques include operation fusion to combine operations, reducing memory use.</p>
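<p>As a toy illustration of operation fusion (plain Python, not tied to any particular compiler), two elementwise passes over a tensor can be collapsed into one, eliminating the intermediate buffer entirely:</p>

```python
def scale(xs, s):
    return [x * s for x in xs]      # pass 1: writes an intermediate list

def shift(xs, b):
    return [x + b for x in xs]      # pass 2: reads that intermediate back

def fused_scale_shift(xs, s, b):
    return [x * s + b for x in xs]  # fused: one pass, no intermediate

data = [1.0, 2.0, 3.0]
# Same result, half the memory traffic over the data
assert shift(scale(data, 2.0), 0.5) == fused_scale_shift(data, 2.0, 0.5)
```

<p>ML compilers apply the same idea to kernels such as convolution + bias + activation, where the saved memory traffic dominates on edge hardware.</p>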
<h4 id="heading-runtimes-for-efficient-inference">Runtimes for Efficient Inference</h4>
<p>Frameworks such as TensorFlow Lite or ONNX Runtime handle inference on devices, supporting quantization for smaller footprints.</p>
<h4 id="heading-optimization-techniques">Optimization Techniques</h4>
<p>Methods like pruning (removing unnecessary parameters) and quantization (reducing precision) enable models to fit on limited hardware. Samsung's optimizations allow large AI models to run efficiently on-device.</p>
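<p>To make quantization concrete, here is a minimal pure-Python sketch of the standard affine (scale/zero-point) scheme; the weight values are illustrative:</p>

```python
def quantize(weights, num_bits=8):
    """Affine quantization: map floats onto integers 0..2^bits - 1."""
    qmax = 2 ** num_bits - 1
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / qmax or 1.0  # guard against constant tensors
    zero_point = round(-lo / scale)
    q = [max(0, min(qmax, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the quantized integers."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.2, -0.3, 0.0, 0.7, 1.5]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
# Each restored weight is within one quantization step of the original
assert all(abs(a - b) <= scale for a, b in zip(weights, restored))
```

<p>Each float32 weight shrinks to one byte, a 4x reduction, at the cost of a bounded rounding error per weight.</p>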
<h4 id="heading-local-servers-for-privacy-focused-ai">Local Servers for Privacy-Focused AI</h4>
<p>Tools for serving models locally, such as Ollama or ML Drift, facilitate GPU-accelerated inference without cloud reliance, ideal for privacy-focused applications.</p>
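<p>As a minimal sketch of talking to such a local server, the following builds a request against Ollama's default HTTP endpoint (<code>localhost:11434</code>). The model name is illustrative, and actually sending the request requires a running Ollama instance:</p>

```python
import json
import urllib.request

def build_ollama_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a request for Ollama's local /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON response, not a stream
    }).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_ollama_request("llama3.1", "Summarize edge AI in one sentence.")
# With Ollama running locally, the call would look like:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

<p>Nothing in this flow touches the network beyond localhost, which is exactly the privacy property these local servers exist to provide.</p>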
<h3 id="heading-googles-coral-npu-a-full-stack-open-source-platform-for-edge-ai">Google's Coral NPU: A Full-Stack Open-Source Platform for Edge AI</h3>
<p>Google's Coral NPU, launched in October 2025, is a groundbreaking full-stack, open-source platform designed to tackle performance, fragmentation, and privacy issues in edge AI, particularly for low-power devices and wearables. It enables always-on AI by embedding intelligence directly into personal devices, ensuring data privacy and operational efficiency.</p>
<h4 id="heading-key-features-and-architecture">Key Features and Architecture</h4>
<p>The Coral NPU features an AI-first architecture built on RISC-V ISA-compliant IP blocks, including a scalar core for lightweight C-programmable tasks, a vector execution unit for SIMD operations compliant with RISC-V Vector v1.0, and a matrix execution unit for quantized neural network computations. This design prioritizes the ML matrix engine over traditional scalar compute, optimizing for machine learning workloads. It delivers 512 GOPS of performance while consuming just a few milliwatts, making it ideal for battery-constrained devices like smartwatches and AR glasses.</p>
<h4 id="heading-open-source-aspects-and-availability">Open-Source Aspects and Availability</h4>
<p>Fully open and extensible, the platform includes documentation and tools available at <a target="_blank" href="http://developers.google.com/coral">developers.google.com/coral</a>, with the matrix core set for GitHub release later in 2025. It supports open standards and collaborations, such as with Synaptics on their Torq NPU subsystem.</p>
<h4 id="heading-integration-and-software-stack">Integration and Software Stack</h4>
<p>Coral NPU integrates seamlessly with compilers like IREE and TFLM, and frameworks including TensorFlow, JAX, and PyTorch. The software toolchain features an MLIR compiler, custom kernels, and a simulator, processing models through StableHLO for optimized binaries. In the broader ecosystem, it acts as a hardware-software bridge, enhancing runtimes and optimizations for IoT and wearables, accelerating applications like ambient sensing, audio processing, and gesture control.</p>
<h4 id="heading-partnerships-and-edge-ai-support">Partnerships and Edge AI Support</h4>
<p>Co-developed with Google Research and DeepMind, it partners with Synaptics for production in Astra SL2610 processors. It supports hardware-enforced privacy via CHERI and targets complex models like small transformers, bringing LLMs to edge devices.</p>
<h2 id="heading-how-to-implement-offline-on-device-ai">How to Implement Offline On-Device AI</h2>
<p>Implementation starts with selecting frameworks and optimizing models for target hardware.</p>
<h3 id="heading-practical-examples-and-use-cases">Practical Examples and Use Cases</h3>
<h4 id="heading-mobile-app-integration-example">Mobile App Integration Example</h4>
<p>Use TensorFlow Lite to deploy a model for offline image recognition. Download a pre-trained model from Hugging Face, quantize it, and integrate via code:</p>
<pre><code class="lang-python"><span class="hljs-keyword">import</span> tensorflow <span class="hljs-keyword">as</span> tf

<span class="hljs-comment"># Load quantized model</span>
interpreter = tf.lite.Interpreter(model_path=<span class="hljs-string">"model.tflite"</span>)
interpreter.allocate_tensors()

<span class="hljs-comment"># Run inference on input data</span>
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
interpreter.set_tensor(input_details[<span class="hljs-number">0</span>][<span class="hljs-string">'index'</span>], input_data)
interpreter.invoke()
output = interpreter.get_tensor(output_details[<span class="hljs-number">0</span>][<span class="hljs-string">'index'</span>])
</code></pre>
<p>This enables real-time object detection on smartphones, useful for augmented reality apps.</p>
<h4 id="heading-iot-device-setup-with-coral-npu">IoT Device Setup with Coral NPU</h4>
<p>With Coral NPU, compile a model using its tools for edge deployment. For a smart camera, optimize for low power and run inference locally, processing video feeds without cloud uploads. Use case: Security systems in remote areas.</p>
<h4 id="heading-hybrid-workflow-example">Hybrid Workflow Example</h4>
<p>Train in the cloud, then deploy on-device via ONNX. Pruning can cut model size substantially (often around 50%), enabling offline chatbots for e-commerce in spotty networks.</p>
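<p>The magnitude-pruning idea behind that size reduction can be sketched in plain Python (real toolchains prune per-layer and usually fine-tune afterwards; this example is illustrative):</p>

```python
def magnitude_prune(weights, sparsity=0.5):
    """Zero out the `sparsity` fraction of weights closest to zero."""
    k = int(len(weights) * sparsity)  # number of weights to drop
    # Indices sorted by magnitude; keep only the largest ones
    keep = set(sorted(range(len(weights)), key=lambda i: abs(weights[i]))[k:])
    return [w if i in keep else 0.0 for i, w in enumerate(weights)]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.02]
pruned = magnitude_prune(w, sparsity=0.5)
# Half the weights become exactly zero; sparse storage then skips them
assert pruned.count(0.0) == 3
```

<p>The zeroed weights compress away in sparse formats, which is where the on-disk and in-memory savings come from.</p>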
<p>Tools like Apple's MLX framework support distributed workloads on M-series chips, ideal for clustering devices like Mac Minis for local AI farms.</p>
<h2 id="heading-why-offline-on-device-ai-is-important">Why Offline On-Device AI is Important</h2>
<p>Offline on-device AI is crucial for privacy, as data never leaves the device, reducing risks of breaches. It cuts latency for real-time applications like autonomous agents and enhances reliability in offline scenarios. Economically, it lowers costs by avoiding cloud fees, and environmentally, it promotes efficient compute. As AI commoditizes, on-device focus, like Apple's strategy, positions it as a competitive edge in a hybrid future.</p>
<p>Sources:</p>
<ul>
<li><p><a target="_blank" href="https://www.f22labs.com/blogs/what-is-on-device-ai-a-complete-guide/">https://www.f22labs.com/blogs/what-is-on-device-ai-a-complete-guide/</a></p>
</li>
<li><p><a target="_blank" href="https://etcjournal.com/2025/10/25/five-emerging-ai-trends-in-late-october-2025/">https://etcjournal.com/2025/10/25/five-emerging-ai-trends-in-late-october-2025/</a></p>
</li>
<li><p><a target="_blank" href="https://tech.yahoo.com/ai/articles/state-ai-2025-google-apple-154700222.html">https://tech.yahoo.com/ai/articles/state-ai-2025-google-apple-154700222.html</a></p>
</li>
<li><p><a target="_blank" href="https://www.nimbleedge.com/blog/state-of-on-device-ai">https://www.nimbleedge.com/blog/state-of-on-device-ai</a></p>
</li>
<li><p><a target="_blank" href="https://medium.com/design-bootcamp/on-device-ai-ux-in-2025-ship-an-honest-hybrid-8e450ca7fd4d">https://medium.com/design-bootcamp/on-device-ai-ux-in-2025-ship-an-honest-hybrid-8e450ca7fd4d</a></p>
</li>
<li><p><a target="_blank" href="https://www.mindkeep.ai/blogs/post/best-offline-ai-models">https://www.mindkeep.ai/blogs/post/best-offline-ai-models</a></p>
</li>
<li><p><a target="_blank" href="https://a-bots.com/blog/Offline-AI-Chat-Apps-Development">https://a-bots.com/blog/Offline-AI-Chat-Apps-Development</a></p>
</li>
<li><p><a target="_blank" href="https://www.abc27.com/business/press-releases/globenewswire/9616357/holiverse-announces-development-of-offline-ai-device-to-return-data-control-to-users">https://www.abc27.com/business/press-releases/globenewswire/9616357/holiverse-announces-development-of-offline-ai-device-to-return-data-control-to-users</a></p>
</li>
<li><p><a target="_blank" href="https://spectrum.ieee.org/ai-models-locally">https://spectrum.ieee.org/ai-models-locally</a></p>
</li>
<li><p><a target="_blank" href="https://www.digitalbricks.ai/blog-posts/ai-progress-in-2025-whats-happened-and-whats-next">https://www.digitalbricks.ai/blog-posts/ai-progress-in-2025-whats-happened-and-whats-next</a></p>
</li>
<li><p><a target="_blank" href="https://siliconsandstudio.substack.com/p/tech-extra-ai-predictions-for-2026">https://siliconsandstudio.substack.com/p/tech-extra-ai-predictions-for-2026</a></p>
</li>
<li><p><a target="_blank" href="https://www.youtube.com/watch?v=rTitZu_0UxM">https://www.youtube.com/watch?v=rTitZu_0UxM</a></p>
</li>
<li><p><a target="_blank" href="https://www.yankodesign.com/2025/12/19/how-ai-will-be-different-at-ces-2026-on-device-processing-and-actual-agentic-productivity/">https://www.yankodesign.com/2025/12/19/how-ai-will-be-different-at-ces-2026-on-device-processing-and-actual-agentic-productivity/</a></p>
</li>
<li><p><a target="_blank" href="https://www.forbes.com/sites/timbajarin/2025/12/29/ces-2026-preview-the-year-tech-gets-ai-context-aware/">https://www.forbes.com/sites/timbajarin/2025/12/29/ces-2026-preview-the-year-tech-gets-ai-context-aware/</a></p>
</li>
<li><p><a target="_blank" href="https://medium.com/%40aleaitsolutions/the-future-of-ai-in-mobile-app-development-trends-to-watch-in-2026-33de64e46b6d">https://medium.com/%40aleaitsolutions/the-future-of-ai-in-mobile-app-development-trends-to-watch-in-2026-33de64e46b6d</a></p>
</li>
<li><p><a target="_blank" href="https://www.linkedin.com/posts/sinead-bovell-89072a34_2026-may-mark-the-year-of-the-offline-renaissance-activity-7407818380868632577-XM_6">https://www.linkedin.com/posts/sinead-bovell-89072a34_2026-may-mark-the-year-of-the-offline-renaissance-activity-7407818380868632577-XM_6</a></p>
</li>
<li><p><a target="_blank" href="https://aiwithallie.beehiiv.com/p/my-2026-ai-predictions-and-the-three-things-you-need-to-focus-on">https://aiwithallie.beehiiv.com/p/my-2026-ai-predictions-and-the-three-things-you-need-to-focus-on</a></p>
</li>
<li><p><a target="_blank" href="https://www.dotcominfoway.com/blog/mobile-app-development-trends-2026-on-device-ai-edge-and-beyond/">https://www.dotcominfoway.com/blog/mobile-app-development-trends-2026-on-device-ai-edge-and-beyond/</a></p>
</li>
<li><p><a target="_blank" href="https://www.novusasi.com/blog/the-rise-of-local-ai-models-going-small-to-go-big">https://www.novusasi.com/blog/the-rise-of-local-ai-models-going-small-to-go-big</a></p>
</li>
<li><p><a target="_blank" href="https://www.youtube.com/watch?v=agU3N6ie0lI">https://www.youtube.com/watch?v=agU3N6ie0lI</a></p>
</li>
<li><p><a target="_blank" href="https://www.digitalocean.com/resources/articles/open-source-ai-platforms">https://www.digitalocean.com/resources/articles/open-source-ai-platforms</a></p>
</li>
<li><p><a target="_blank" href="https://www.atlantic.net/gpu-server-hosting/top-open-source-ai-ml-frameworks-in-2025/">https://www.atlantic.net/gpu-server-hosting/top-open-source-ai-ml-frameworks-in-2025/</a></p>
</li>
<li><p><a target="_blank" href="https://github.com/NexaAI/Awesome-LLMs-on-device">https://github.com/NexaAI/Awesome-LLMs-on-device</a></p>
</li>
<li><p><a target="_blank" href="https://www.reddit.com/r/opensource/comments/1m6btnb/we_just_opensourced_the_first_mobile_ai_agent/">https://www.reddit.com/r/opensource/comments/1m6btnb/we_just_opensourced_the_first_mobile_ai_agent/</a></p>
</li>
<li><p><a target="_blank" href="https://greennode.ai/blog/best-open-source-ai-platforms">https://greennode.ai/blog/best-open-source-ai-platforms</a></p>
</li>
<li><p><a target="_blank" href="https://www.instaclustr.com/education/open-source-ai/top-10-open-source-llms-for-2025/">https://www.instaclustr.com/education/open-source-ai/top-10-open-source-llms-for-2025/</a></p>
</li>
<li><p><a target="_blank" href="https://www.ibm.com/think/insights/open-source-ai-tools">https://www.ibm.com/think/insights/open-source-ai-tools</a></p>
</li>
<li><p><a target="_blank" href="https://developer.samsung.com/conference/sdc23/sessions/open-source-ondevice-ai-sw-platform-for-optimized-executions-personalization-pipelines-and-mlops">https://developer.samsung.com/conference/sdc23/sessions/open-source-ondevice-ai-sw-platform-for-optimized-executions-personalization-pipelines-and-mlops</a></p>
</li>
<li><p><a target="_blank" href="https://blog.huebits.in/top-10-edge-ai-frameworks-for-2025-best-tools-for-real-time-on-device-machine-learning/">https://blog.huebits.in/top-10-edge-ai-frameworks-for-2025-best-tools-for-real-time-on-device-machine-learning/</a></p>
</li>
<li><p><a target="_blank" href="https://anshadameenza.com/blog/technology/2025-12-08-on-device-hybrid-architectures-edge-ai/">https://anshadameenza.com/blog/technology/2025-12-08-on-device-hybrid-architectures-edge-ai/</a></p>
</li>
<li><p><a target="_blank" href="https://www.cognativ.com/blogs/post/discovering-local-ai-solutions-guide-to-efficient-private-tech/267">https://www.cognativ.com/blogs/post/discovering-local-ai-solutions-guide-to-efficient-private-tech/267</a></p>
</li>
<li><p><a target="_blank" href="https://valerelabs.medium.com/edge-ai-the-rise-of-on-device-ai-8e6348bea620">https://valerelabs.medium.com/edge-ai-the-rise-of-on-device-ai-8e6348bea620</a></p>
</li>
<li><p><a target="_blank" href="https://latentai.com/blog/ai-model-compilation-for-high-performance-models-at-the-edge/">https://latentai.com/blog/ai-model-compilation-for-high-performance-models-at-the-edge/</a></p>
</li>
<li><p><a target="_blank" href="https://huyenchip.com/2021/09/07/a-friendly-introduction-to-machine-learning-compilers-and-optimizers.html">https://huyenchip.com/2021/09/07/a-friendly-introduction-to-machine-learning-compilers-and-optimizers.html</a></p>
</li>
<li><p><a target="_blank" href="https://semiconductor.samsung.com/news-events/tech-blog/samsungs-pivotal-role-in-pioneering-on-device-generative-ai/">https://semiconductor.samsung.com/news-events/tech-blog/samsungs-pivotal-role-in-pioneering-on-device-generative-ai/</a></p>
</li>
<li><p><a target="_blank" href="https://arxiv.org/html/2503.06027v1">https://arxiv.org/html/2503.06027v1</a></p>
</li>
<li><p><a target="_blank" href="https://www.mobilint.com/post/how-local-llms-transform-ai-strategy">https://www.mobilint.com/post/how-local-llms-transform-ai-strategy</a></p>
</li>
<li><p><a target="_blank" href="https://harvard-edge.github.io/cs249r_book/contents/core/frameworks/frameworks.html">https://harvard-edge.github.io/cs249r_book/contents/core/frameworks/frameworks.html</a></p>
</li>
<li><p><a target="_blank" href="https://www.interactivesilicon.ai/news/ml-drift-on-device-generative-ai-impact-and-adoption/">https://www.interactivesilicon.ai/news/ml-drift-on-device-generative-ai-impact-and-adoption/</a></p>
</li>
<li><p><a target="_blank" href="https://developers.googleblog.com/introducing-coral-npu-a-full-stack-platform-for-edge-ai/">https://developers.googleblog.com/introducing-coral-npu-a-full-stack-platform-for-edge-ai/</a></p>
</li>
<li><p><a target="_blank" href="https://developers.google.com/coral/guides/intro">https://developers.google.com/coral/guides/intro</a></p>
</li>
<li><p><a target="_blank" href="https://www.linkedin.com/pulse/coral-npu-full-stack-platform-edge-ai-billy-rutledge-dfwkc">https://www.linkedin.com/pulse/coral-npu-full-stack-platform-edge-ai-billy-rutledge-dfwkc</a></p>
</li>
<li><p><a target="_blank" href="https://www.infoq.com/news/2025/10/google-coral-npu-platform/">https://www.infoq.com/news/2025/10/google-coral-npu-platform/</a></p>
</li>
<li><p><a target="_blank" href="https://howaiworks.ai/blog/google-coral-npu-announcement-2025">https://howaiworks.ai/blog/google-coral-npu-announcement-2025</a></p>
</li>
<li><p><a target="_blank" href="https://eu.36kr.com/en/p/3511522467601541">https://eu.36kr.com/en/p/3511522467601541</a></p>
</li>
<li><p><a target="_blank" href="https://www.cnx-software.com/2025/10/17/google-open-source-coral-npu-synaptics-sl2610-edge-ai-socs/">https://www.cnx-software.com/2025/10/17/google-open-source-coral-npu-synaptics-sl2610-edge-ai-socs/</a></p>
</li>
<li><p><a target="_blank" href="https://www.youtube.com/watch?v=dZVs5u1urc0">https://www.youtube.com/watch?v=dZVs5u1urc0</a></p>
</li>
<li><p><a target="_blank" href="https://developers.google.com/coral">https://developers.google.com/coral</a></p>
</li>
<li><p><a target="_blank" href="https://www.design-reuse.com/news/202529652-verisilicon-and-google-jointly-launch-open-source-coral-npu-ip/">https://www.design-reuse.com/news/202529652-verisilicon-and-google-jointly-launch-open-source-coral-npu-ip/</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/Joelc_eth/status/2006082411549888684">https://x.com/Joelc_eth/status/2006082411549888684</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/shakeorgetbaked/status/2006075351651528832">https://x.com/shakeorgetbaked/status/2006075351651528832</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/devpost/status/2006064500454097123">https://x.com/devpost/status/2006064500454097123</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/TrevonAI/status/2005987813708431542">https://x.com/TrevonAI/status/2005987813708431542</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/robinsnewswire/status/2005643513279521063">https://x.com/robinsnewswire/status/2005643513279521063</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/GChkhvirkia/status/2005560772332749053">https://x.com/GChkhvirkia/status/2005560772332749053</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/X_Gen_info/status/2005428591316902395">https://x.com/X_Gen_info/status/2005428591316902395</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/BullLord337/status/2005166825257288092">https://x.com/BullLord337/status/2005166825257288092</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/SeanMcD29465784/status/2004693420707569979">https://x.com/SeanMcD29465784/status/2004693420707569979</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/lmanchu/status/2004613384080355714">https://x.com/lmanchu/status/2004613384080355714</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/trymirai/status/2004220598650536150">https://x.com/trymirai/status/2004220598650536150</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/Noto_Workspace/status/2003876929666273597">https://x.com/Noto_Workspace/status/2003876929666273597</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/TonyeJonat13016/status/2003782934965239846">https://x.com/TonyeJonat13016/status/2003782934965239846</a></p>
</li>
<li><p><a target="_blank" href="https://x.com/OwnArnab/status/2003164159995969800">https://x.com/OwnArnab/status/2003164159995969800</a></p>
</li>
<li><p><a target="_blank" href="https://developers.googleblog.com/en/introducing-coral-npu-a-full-stack-platform-for-edge-ai/">https://developers.googleblog.com/en/introducing-coral-npu-a-full-stack-platform-for-edge-ai/</a></p>
</li>
</ul>
]]></content:encoded></item><item><title><![CDATA[Day 3/100: AI SDK 6: Revolutionizing AI Application Development]]></title><description><![CDATA[Intro to Vercel’s AI SDK
As part of the #100DaysOfAi series, we're diving into the latest advancements in AI tools. Vercel has just launched AI SDK 6, a significant update to their TypeScript toolkit designed for building AI-powered applications. Thi...]]></description><link>https://blog.karanbalaji.com/day-3100-ai-sdk-6-revolutionizing-ai-application-development</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-3100-ai-sdk-6-revolutionizing-ai-application-development</guid><category><![CDATA[AI]]></category><category><![CDATA[Artificial Intelligence]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[coding]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Mon, 22 Dec 2025 22:59:13 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1766443929009/98bed4a8-c56f-4824-866a-98024c2e327f.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-intro-to-vercels-ai-sdk">Intro to Vercel’s AI SDK</h1>
<p>As part of the #100DaysOfAi series, we're diving into the latest advancements in AI tools. Vercel has just launched AI SDK 6, a significant update to their TypeScript toolkit designed for building AI-powered applications. This release builds on the SDK's foundation, offering enhanced capabilities for developers working with large language models (LLMs) and integrating AI seamlessly into web and server-side projects. With over 20 million monthly downloads, it's a go-to resource for creating everything from simple chatbots to sophisticated agents.</p>
<p>The update emphasizes composability, safety, and efficiency, making it particularly valuable for generative UI experiences where real-time interactions and dynamic content generation are key. We've leveraged it extensively in projects involving generative UI, where streaming responses and tool integrations create fluid, interactive interfaces. Additionally, it pairs well with AI Elements, an open-source library of customizable React components built on shadcn/ui, specifically for constructing chat interfaces and workflows. AI Elements provides prebuilt, headless-first UI primitives like conversations and messages, enabling developers to craft professional chat UIs similar to those in ChatGPT or Claude, handling complexities such as streaming and tool rendering effortlessly.</p>
<h2 id="heading-what-is-ai-sdk-6-about">What is AI SDK 6 About?</h2>
<p>AI SDK 6 is the sixth major version of Vercel's AI SDK, a unified TypeScript API for connecting to various AI providers like OpenAI, Anthropic, Google, and xAI. It simplifies building AI applications across frameworks such as Next.js, React, Svelte, Vue, and Node.js. The core focus is on enabling reusable agents, advanced tool handling, and production-ready features that support complex workflows.</p>
<h3 id="heading-key-highlights-and-new-features">Key Highlights and New Features</h3>
<p>This release packs over a dozen enhancements, drawing from community feedback and real-world use cases. Here's a breakdown:</p>
<ul>
<li><p><strong>Agents Abstraction</strong>: Introduces composable agents that can be defined once and reused in UIs, jobs, or APIs. The ToolLoopAgent handles automated loops for LLM calls, tool executions, and iterations, with configurable stop conditions such as halting after 20 steps.</p>
</li>
<li><p><strong>Tool Enhancements</strong>:</p>
<ul>
<li><p><strong>Execution Approval (Human-in-the-Loop)</strong>: Flags sensitive tools for manual review, integrating with UIs via hooks like useChat to prompt users.</p>
</li>
<li><p><strong>Strict Mode</strong>: Ensures input schemas match exactly, preventing failures.</p>
</li>
<li><p><strong>Input Examples</strong>: Guides models with concrete schema examples for better alignment.</p>
</li>
<li><p><strong>toModelOutput Function</strong>: Separates full outputs from model-visible data to optimize tokens.</p>
</li>
</ul>
</li>
<li><p><strong>Model Context Protocol (MCP) Support</strong>: Now stable, with HTTP transport, OAuth, resources for data exposure, reusable prompts, and elicitation for mid-process user input.</p>
</li>
<li><p><strong>Structured Output with Tool Calling</strong>: Unifies generation functions to support multi-step loops ending in objects, arrays, choices, JSON, or text.</p>
</li>
<li><p><strong>DevTools</strong>: A debugging interface for inspecting agent flows, including prompts, outputs, and metrics. Launch it with a simple command for real-time visibility.</p>
</li>
<li><p><strong>Reranking</strong>: Reorders search results for relevance using providers like Cohere, improving context for models.</p>
</li>
<li><p><strong>Standard JSON Schema</strong>: Compatible with any schema library following the V1 standard, eliminating custom converters.</p>
</li>
<li><p><strong>Image Editing</strong>: Extends image generation to include reference images for edits, like transforming one image based on a prompt.</p>
</li>
<li><p><strong>Raw Finish Reasons and Extended Usage</strong>: Provides provider-specific stop reasons and detailed token breakdowns for better optimization.</p>
</li>
<li><p><strong>LangChain Adapter Rewrite</strong>: Supports modern LangChain features, including streams, interrupts, and browser-side connections.</p>
</li>
<li><p><strong>Provider-Specific Tools</strong>: Expands with tools like memory and code execution for Anthropic, shell and patch for OpenAI, maps and RAG for Google, and search/code for xAI.</p>
</li>
</ul>
<p>Breaking changes are minimal, with an automated codemod for migration. The SDK now aligns with the v3 Language Model Specification, enhancing agent and tool capabilities.</p>
<h2 id="heading-how-to-implement-ai-sdk-6">How to Implement AI SDK 6</h2>
<p>Implementing AI SDK 6 starts with installation via npm: <code>npm install ai</code>. For upgrades, use the migration codemod: <code>npx @ai-sdk/codemod upgrade</code>.</p>
<h3 id="heading-practical-examples-and-use-cases">Practical Examples and Use Cases</h3>
<h4 id="heading-building-an-agent">Building an Agent</h4>
<p>Define an agent with a model, instructions, and tools:</p>
<pre><code class="lang-typescript"><span class="hljs-keyword">import</span> { createAgent } <span class="hljs-keyword">from</span> <span class="hljs-string">'ai'</span>;
<span class="hljs-keyword">import</span> { openai } <span class="hljs-keyword">from</span> <span class="hljs-string">'@ai-sdk/openai'</span>;

<span class="hljs-keyword">const</span> agent = createAgent({
  model: openai(<span class="hljs-string">'gpt-4o'</span>),
  instructions: <span class="hljs-string">'You are a helpful assistant.'</span>,
  tools: [<span class="hljs-comment">/* your tools here */</span>],
});

<span class="hljs-keyword">const</span> result = <span class="hljs-keyword">await</span> agent.generate({ prompt: <span class="hljs-string">'User query here'</span> });
</code></pre>
<p>In generative UI, stream responses using <code>stream</code> for real-time updates in a chat interface built with AI Elements components.</p>
<h4 id="heading-tool-execution-with-approval">Tool Execution with Approval</h4>
<p>For safety in production:</p>
<pre><code class="lang-typescript"><span class="hljs-keyword">import</span> { z } <span class="hljs-keyword">from</span> <span class="hljs-string">'zod'</span>;

<span class="hljs-keyword">const</span> deleteFileTool = {
  name: <span class="hljs-string">'deleteFile'</span>,
  description: <span class="hljs-string">'Delete a file'</span>,
  parameters: z.object({ path: z.string() }),
  needsApproval: <span class="hljs-literal">true</span>, <span class="hljs-comment">// Requires human review</span>
  execute: <span class="hljs-keyword">async</span> ({ path }) =&gt; { <span class="hljs-comment">/* deletion logic */</span> },
};
</code></pre>
<p>Integrate with useChat hook in React for UI prompts.</p>
<h4 id="heading-reranking-for-better-search">Reranking for Better Search</h4>
<p>Improve RAG pipelines:</p>
<pre><code class="lang-typescript"><span class="hljs-keyword">import</span> { rerank } <span class="hljs-keyword">from</span> <span class="hljs-string">'ai'</span>;
<span class="hljs-keyword">import</span> { cohere } <span class="hljs-keyword">from</span> <span class="hljs-string">'@ai-sdk/cohere'</span>;

<span class="hljs-keyword">const</span> rerankedDocs = <span class="hljs-keyword">await</span> rerank({
  model: cohere(<span class="hljs-string">'rerank-english-v3.0'</span>),
  query: <span class="hljs-string">'Best AI tools'</span>,
  documents: [<span class="hljs-comment">/* list of docs */</span>],
});
</code></pre>
<p>Use case: In a sales agent like Claygent, rerank web-scraped data for targeted insights.</p>
<h4 id="heading-image-editing">Image Editing</h4>
<p>Generate edited images:</p>
<pre><code class="lang-typescript"><span class="hljs-keyword">import</span> { generateImage } <span class="hljs-keyword">from</span> <span class="hljs-string">'ai'</span>;

<span class="hljs-keyword">const</span> image = <span class="hljs-keyword">await</span> generateImage({
  model: openai(<span class="hljs-string">'dall-e-3'</span>),
  prompt: <span class="hljs-string">'Two tanukis on a date'</span>,
  referenceImages: [<span class="hljs-string">'url-to-original-image'</span>],
});
</code></pre>
<p>Ideal for generative UI where users iteratively refine visuals.</p>
<p>For chat interfaces, combine with AI Elements:</p>
<pre><code class="lang-tsx">import { Conversation, Message } from '@vercel/ai-elements';

function ChatUI() {
  return (
    &lt;Conversation&gt;
      &lt;Message role="user"&gt;Hello!&lt;/Message&gt;
      {/* Stream AI responses here */}
    &lt;/Conversation&gt;
  );
}
</code></pre>
<p>This setup handles streaming, tool calls, and persistence seamlessly.</p>
<h2 id="heading-why-ai-sdk-6-is-important">Why AI SDK 6 is Important</h2>
<p>AI SDK 6 addresses critical pain points in AI development, such as scalability, safety, and interoperability. By introducing agents and human-in-the-loop features, it reduces risks in production environments, making AI more reliable for enterprises like Thomson Reuters, which built CoCounsel rapidly. Its focus on type safety and reusability accelerates development, cutting time from months to weeks.</p>
<p>In the broader AI landscape, it empowers generative UI by enabling dynamic, interactive experiences that feel natural and responsive. Paired with AI Elements, it democratizes building advanced chat systems, fostering innovation in workflows and assistants. As AI adoption grows, tools like this ensure developers can integrate cutting-edge models without vendor lock-in, driving efficiency and creativity in the #100DaysOfAi journey.</p>
<p>Sources:</p>
<ul>
<li><p><a target="_blank" href="https://vercel.com/blog/ai-sdk-6">https://vercel.com/blog/ai-sdk-6</a></p>
</li>
<li><p><a target="_blank" href="https://vercel.com/changelog/introducing-ai-elements">https://vercel.com/changelog/introducing-ai-elements</a></p>
</li>
<li><p><a target="_blank" href="https://vercel.com/academy/ai-sdk/ai-elements">https://vercel.com/academy/ai-sdk/ai-elements</a></p>
</li>
<li><p><a target="_blank" href="https://ai-sdk.dev/docs/ai-sdk-ui">https://ai-sdk.dev/docs/ai-sdk-ui</a></p>
</li>
</ul>
]]></content:encoded></item><item><title><![CDATA[Day 2/100: Revolutionizing AI Agents with A2UI and Interactions API]]></title><description><![CDATA[Intro to Google’s AI releases
In this installment of the #100DaysOfAi series, we dive into two groundbreaking advancements from Google that are reshaping how AI agents interact with users and developers. A2UI, an open-source protocol for agent-driven...]]></description><link>https://blog.karanbalaji.com/day-2100-revolutionizing-ai-agents-with-a2ui-and-interactions-api</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-2100-revolutionizing-ai-agents-with-a2ui-and-interactions-api</guid><category><![CDATA[AI]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Artificial Intelligence]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Mon, 22 Dec 2025 22:39:59 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1766443128103/77f5da7d-ad7f-4948-af3d-32d99fe3de64.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-intro-to-googles-ai-releases">Intro to Google’s AI releases</h1>
<p>In this installment of the #100DaysOfAi series, we dive into two groundbreaking advancements from Google that are reshaping how AI agents interact with users and developers. A2UI, an open-source protocol for agent-driven user interfaces, and the Interactions API, a unified foundation for models and agents, represent key steps toward more dynamic, secure, and efficient AI systems. Drawing from recent announcements and developments as of December 2025, we'll explore these technologies through their core concepts, implementation strategies, and broader significance.</p>
<h2 id="heading-what-these-concepts-are-about">What These Concepts Are About</h2>
<p>A2UI stands for Agent to UI, a declarative protocol designed to let AI agents generate rich, interactive user interfaces without executing arbitrary code. Launched by Google with contributions from the open-source community, including CopilotKit, it addresses a core challenge in AI: safely transmitting complex UIs across trust boundaries. Instead of plain text responses or risky code, agents send structured JSON descriptions of components, which clients render natively on web, mobile, or desktop platforms. Currently in version 0.8 and public preview, A2UI is Apache 2.0 licensed and hosted on GitHub. It emphasizes security by using pre-approved component catalogs, preventing UI injection attacks, and supports progressive rendering for real-time updates.</p>
<p>The Interactions API, introduced by Google DeepMind on December 11, 2025, provides a single RESTful endpoint for seamless access to Gemini models like Gemini 3 Pro and agents such as Gemini Deep Research. Available in public beta via the Gemini API in Google AI Studio, it evolves beyond simple request-response interactions. Key elements include server-side state management to handle conversation histories, background execution for long-running tasks, and support for tools like Model Context Protocol (MCP) servers. This API unifies model and agent interactions, making it easier to build agentic applications that involve thinking, tool calls, and complex histories.</p>
<p>Together, these tools highlight Google's push toward agentic AI, where agents not only process information but also create intuitive interfaces and manage intricate workflows. Recent developments, such as integrations with the Agent Development Kit (ADK) and expansions to more built-in agents, underscore their rapid evolution.</p>
<h3 id="heading-key-features-of-a2ui">Key Features of A2UI</h3>
<p>A2UI's design focuses on LLM-friendliness with a flat, streaming JSON format, framework-agnostic rendering (e.g., via Angular or Flutter), and custom components like interactive charts or maps.</p>
<h3 id="heading-key-features-of-interactions-api">Key Features of Interactions API</h3>
<p>It offers interpretable data models for debugging, optional server-side caching to reduce costs, and composability for manipulating agent histories.</p>
<h2 id="heading-how-to-implement-them-with-practical-examples-and-use-cases">How to Implement Them with Practical Examples and Use Cases</h2>
<p>Implementing A2UI starts with integrating its protocol into an AI agent workflow. Developers can clone the GitHub repository and use provided renderers for client-side implementation. For instance, in a web app built with React, you define a component catalog and parse incoming A2UI JSON messages to render elements like buttons or forms.</p>
<p>A practical example is the Landscape Architect Demo: Upload a photo to an agent powered by Gemini, which analyzes it and streams A2UI messages to generate an interactive UI with design suggestions, sliders for adjustments, and real-time previews. In code, this involves the agent generating a JSON structure like {"component": "slider", "props": {"min": 0, "max": 100}} and the client rendering it natively.</p>
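<p>The security model becomes clearer with a sketch of the client side: rendering reduces to a lookup against a pre-approved component catalog, so an unknown component is rejected rather than executed. The component names and message shape below are illustrative, not the actual A2UI v0.8 schema.</p>

```python
import json

# Pre-approved catalog: only these components can ever be rendered,
# so a malicious message cannot inject arbitrary UI or code.
CATALOG = {
    "slider": lambda props: f"<slider min={props['min']} max={props['max']}>",
    "button": lambda props: f"<button>{props['label']}</button>",
}

def render(message: str) -> str:
    """Render one A2UI-style JSON message, rejecting unknown components."""
    msg = json.loads(message)
    component = msg["component"]
    if component not in CATALOG:
        raise ValueError(f"component not in catalog: {component}")
    return CATALOG[component](msg.get("props", {}))

print(render('{"component": "slider", "props": {"min": 0, "max": 100}}'))
```

A real renderer would emit native widgets (Angular, Flutter, React) instead of strings, but the catalog-lookup pattern is the same.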
<p>Use cases include enterprise applications where agents guide users through complex tasks, such as data visualization in business dashboards or interactive tutorials in educational tools. For mobile apps, Flutter integration allows the same A2UI descriptions to render on iOS and Android, ensuring consistency.</p>
<p>For the Interactions API, access begins with a Gemini API key from Google AI Studio. Use the /interactions endpoint in your code, specifying either a model or agent parameter.</p>
<p>In Python, a simple implementation for a model query might look like this:</p>
<pre><code class="lang-python"><span class="hljs-keyword">from</span> google <span class="hljs-keyword">import</span> genai

client = genai.Client()  <span class="hljs-comment"># reads the GEMINI_API_KEY environment variable</span>
interaction = client.interactions.create(
    model=<span class="hljs-string">"gemini-3-pro-preview"</span>,
    input=<span class="hljs-string">"Who won the last Euro?"</span>,
    tools=[{<span class="hljs-string">"type"</span>: <span class="hljs-string">"google_search"</span>}]
)
print(interaction)
</code></pre>
<p>For agents, enable background execution for research tasks:</p>
<pre><code class="lang-python">interaction = client.interactions.create(
    agent=<span class="hljs-string">"deep-research-pro-preview-12-2025"</span>,
    input=<span class="hljs-string">"Research the history of Google TPUs."</span>,
    background=<span class="hljs-literal">True</span>
)
</code></pre>
<p>Poll for results later, ideal for apps handling long-horizon tasks without keeping connections open.</p>
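<p>The polling loop itself is generic. In the sketch below, <code>fetch_status</code> is a hypothetical stand-in for whatever retrieval call the SDK exposes; the point is the shape of the loop, not the exact API.</p>

```python
import time

def poll_until_done(fetch_status, interaction_id, interval_s=1.0, max_attempts=30):
    """Poll a background interaction until it leaves the 'in_progress' state.

    `fetch_status` is a placeholder for the SDK's retrieval call; it should
    return a dict with a 'status' field and, once finished, a 'result'.
    """
    for _ in range(max_attempts):
        state = fetch_status(interaction_id)
        if state["status"] != "in_progress":
            return state
        time.sleep(interval_s)
    raise TimeoutError(f"interaction {interaction_id} still running")

# Stubbed example: the interaction finishes on the third poll.
responses = iter([
    {"status": "in_progress"},
    {"status": "in_progress"},
    {"status": "completed", "result": "TPU history report"},
])
print(poll_until_done(lambda _id: next(responses), "abc123", interval_s=0.0))
```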
<p>Use cases span developer tools, such as embedding Gemini Deep Research in custom apps for automated report generation on topics like quantum computing or market analysis. In production, it simplifies state management for chatbots or virtual assistants, reducing errors and costs through caching.</p>
<p>Combining both: Use Interactions API to orchestrate an agent that outputs A2UI messages, enabling end-to-end agentic UIs. For example, a travel planning app could research destinations via the API and render interactive maps via A2UI.</p>
<h2 id="heading-why-these-are-important">Why These Are Important</h2>
<p>A2UI and the Interactions API are pivotal because they bridge gaps in current AI ecosystems. A2UI enhances user experience by moving beyond text walls to interactive, secure UIs, fostering trust in agent-driven systems. This is crucial as AI agents proliferate in sensitive areas like healthcare or finance, where security and usability are paramount.</p>
<p>The Interactions API streamlines development for agentic apps, addressing limitations in stateless models by handling complexity on the server. With AI shifting toward autonomous agents capable of multi-step reasoning, this API lowers barriers for developers, potentially accelerating innovations like personalized education or advanced research tools.</p>
<p>In the broader AI landscape, these advancements promote open standards and collaboration, as seen in A2UI's community contributions and the API's integration with protocols like A2A. As of late 2025, with Gemini's expansions into products like Google Search and NotebookLM, they signal a future where AI agents become integral infrastructure, driving efficiency and creativity across industries.</p>
<p>Sources:</p>
<ul>
<li><p>A2UI Official Site: <a target="_blank" href="https://a2ui.org/">https://a2ui.org/</a></p>
</li>
<li><p>Google Interactions API Blog: <a target="_blank" href="https://blog.google/technology/developers/interactions-api/">https://blog.google/technology/developers/interactions-api/</a></p>
</li>
<li><p>A2UI GitHub Repository: <a target="_blank" href="https://github.com/google/A2UI">https://github.com/google/A2UI</a></p>
</li>
<li><p>Gemini API Documentation: <a target="_blank" href="https://ai.google.dev/gemini-api/docs/interactions">https://ai.google.dev/gemini-api/docs/interactions</a></p>
</li>
<li><p>Building Agents with ADK and Interactions API: <a target="_blank" href="https://developers.googleblog.com/building-agents-with-the-adk-and-the-new-interactions-api/">https://developers.googleblog.com/building-agents-with-the-adk-and-the-new-interactions-api/</a></p>
</li>
</ul>
]]></content:encoded></item><item><title><![CDATA[Day 1/100: The Future of Generative UIs: What to Expect in 2026]]></title><description><![CDATA[Intro to Generative UIs
Welcome to Day 1 of the #100DaysOfAI series! On December 21, 2025, we're kicking off with one of the most exciting shifts in AI: Generative User Interfaces (Generative UIs). As 2025 draws to a close, we've seen massive l...]]></description><link>https://blog.karanbalaji.com/100-days-of-ai-day-1-generative-uis</link><guid isPermaLink="true">https://blog.karanbalaji.com/100-days-of-ai-day-1-generative-uis</guid><category><![CDATA[AI]]></category><category><![CDATA[UI]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Artificial Intelligence]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Mon, 22 Dec 2025 21:59:12 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1766439606142/109cfb69-3ccf-453d-ac10-50f040afd920.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-intro-to-generative-uis">Intro to Generative UIs</h1>
<p>Welcome to Day 1 of the #100DaysOfAI series! On December 21, 2025, we're kicking off with one of the most exciting shifts in AI: Generative User Interfaces (Generative UIs). As 2025 draws to a close, we've seen massive leaps with Google's Gemini 3 rollout and emerging standards like the Model Context Protocol's UI extensions. This sets the stage for an explosive 2026 where AI doesn't just respond with text but builds entire interactive experiences on the fly. Let's dive into the current reality and what's coming next.</p>
<h2 id="heading-what-understanding-generative-uis">What: Understanding Generative UIs</h2>
<p>Generative UIs mark a fundamental evolution where large language models (LLMs) generate not only content but complete, interactive user interfaces tailored to a user's prompt or context. Traditional UIs are static, predefined layouts coded by developers. In contrast, Generative UIs are dynamic: AI analyzes intent, fetches data via tools, and synthesizes custom elements like buttons, charts, simulations, or full apps in real time.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1766440569744/4f1f8432-d83c-495f-a003-5e129eabfd9b.png" alt class="image--center mx-auto" /></p>
<p>This capability exploded in late 2025 with Google's Gemini 3, which powers "Dynamic View" in the Gemini app and AI Mode in Search. For any query, Gemini 3 creates bespoke interfaces, such as interactive loan calculators or physics simulations. Frameworks like Vercel's AI SDK enable this by linking tool calls (e.g., data retrieval) to React components for rendering. Emerging protocols, including extensions to the Model Context Protocol (MCP), allow secure embedding of rich UIs via iframes, supporting bidirectional agent-user interactions.</p>
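<p>To make the pattern concrete, here is a minimal TypeScript sketch of the tool-call-to-component mapping described above. The names (<code>getWeather</code>, <code>componentFor</code>, <code>renderToolCall</code>) are hypothetical placeholders, not the actual Vercel AI SDK API, and the renderers return strings instead of real React components so the example stays self-contained:</p>

```typescript
// Illustrative sketch of the tool-call -> component pattern; names are
// hypothetical, not the actual Vercel AI SDK API.

type ToolResult = { tool: string; data: Record<string, unknown> };

// 1. A tool the model can call to fetch data (mocked here; a real app
//    would hit a weather API).
function getWeather(city: string): ToolResult {
  return { tool: "getWeather", data: { city, tempC: 21, sky: "clear" } };
}

// 2. A registry mapping each tool to a UI renderer.
const componentFor: Record<string, (d: Record<string, unknown>) => string> = {
  getWeather: (d) => `<WeatherCard city="${d.city}" tempC={${d.tempC}} />`,
};

// 3. When the model emits a tool call, run it and render the matching
//    component instead of a wall of text.
function renderToolCall(result: ToolResult): string {
  const render = componentFor[result.tool];
  return render ? render(result.data) : JSON.stringify(result.data);
}

console.log(renderToolCall(getWeather("Toronto")));
// prints: <WeatherCard city="Toronto" tempC={21} />
```

<p>In a real AI SDK app, the same registry idea is what swaps streamed tool results for live React components in the chat UI.</p>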
<p><a target="_blank" href="https://blog.modelcontextprotocol.io/posts/2025-11-21-mcp-apps/">MCP UI APPS</a> Example:</p>
<p><img src="https://blog.modelcontextprotocol.io/posts/images/fullscreen-chat-app.png" alt="Example of a fullscreen app with a rich data table interface" /></p>
<p>At its heart, Generative UI turns AI into an on-demand designer and developer, moving us from fixed apps to intent-driven, adaptive experiences.</p>
<h2 id="heading-how-practical-use-cases-and-current-examples">How: Practical Use Cases and Current Examples</h2>
<p>Generative UIs are already live in 2025 products, built via tool-calling workflows where LLMs decide on actions and render outputs.</p>
<ol>
<li><p><strong>Core Implementation</strong>: Developers define tools (functions for tasks like weather lookup) and map results to UI components. In Vercel's AI SDK, a prompt triggers tool execution, streaming results to custom React elements like weather cards.</p>
</li>
<li><p><strong>Real-World Examples Today</strong>:</p>
<ul>
<li><p><strong>Google Gemini App and Search</strong>: Ask "Compare mortgage options," and Gemini 3 generates an interactive calculator with sliders for rates and terms. For education, prompts like "Explain RNA polymerase" yield custom simulations with controls.</p>
</li>
<li><p><strong>Data Visualization</strong>: In AI Mode, queries produce tailored charts, grids, or maps, outperforming static responses.</p>
</li>
<li><p><strong>Personalized Tools</strong>: Gemini creates custom event planners or learning games, adapting complexity (e.g., simple for kids, detailed for experts).</p>
</li>
<li><p><strong>Developer Tools</strong>: Google's A2UI protocol (launched December 2025) standardizes agent-generated interfaces, integrable with frameworks like React or Flutter for enterprise workflows.</p>
</li>
<li><p><strong>Chat Enhancements</strong>: Using AI SDK patterns, apps render stock tickers, calendars, or forms directly in conversations.</p>
</li>
</ul>
</li>
</ol>
<p>These examples show Generative UIs making interactions more intuitive, reducing text overload with visuals and controls.</p>
<h2 id="heading-why-the-importance-and-outlook-for-2026">Why: The Importance and Outlook for 2026</h2>
<p>Generative UIs are pivotal because they solve the limitations of text-heavy AI, delivering personalized, efficient experiences that feel truly native. Human evaluations show strong preferences for these dynamic interfaces over standard outputs, boosting engagement and accessibility.</p>
<p>Their importance lies in:</p>
<ul>
<li><p><strong>Hyper-Personalization</strong>: Interfaces adapt to user expertise, device, or context, making tech inclusive.</p>
</li>
<li><p><strong>Developer Efficiency</strong>: Shift focus from coding fixed layouts to defining intents and tools, accelerating app creation.</p>
</li>
<li><p><strong>Agentic Future</strong>: With protocols like A2UI and MCP extensions, AI agents will embed rich UIs seamlessly, powering multi-agent systems.</p>
</li>
</ul>
<p>Looking to 2026: Expect widespread adoption, with automatic UI generation (no manual toggles), deeper integration across apps, and multimodal enhancements (voice, AR). Enterprises will deploy agent-driven dashboards, while consumers get "just-in-time" apps replacing static ones. Challenges like speed and consistency will improve, making Generative UIs the default for AI interactions.</p>
<p>This technology isn't optional—it's the bridge to an adaptive, human-centric digital world.</p>
<p><strong>Sources</strong></p>
<ul>
<li><p><a target="_blank" href="https://research.google/blog/generative-ui-a-rich-custom-visual-interactive-user-experience-for-any-prompt/">Google Research: Generative UI Blog</a></p>
</li>
<li><p><a target="_blank" href="https://ai-sdk.dev/docs/ai-sdk-ui/generative-user-interfaces#generative-user-interfaces">Vercel AI SDK Documentation: Generative User Interfaces</a></p>
</li>
<li><p><a target="_blank" href="https://9to5google.com/2025/11/25/gemini-generative-uis-apps/">9to5Google: Gemini Generative UIs</a></p>
</li>
<li><p><a target="_blank" href="https://developers.googleblog.com/introducing-a2ui-an-open-project-for-agent-driven-interfaces/">Google Developers Blog: Introducing A2UI</a></p>
</li>
<li><p><a target="_blank" href="https://www.nngroup.com/articles/generative-ui/">NN/g: Generative UI and Outcome-Oriented Design</a></p>
</li>
<li><p><a target="_blank" href="https://blog.modelcontextprotocol.io/posts/2025-11-21-mcp-apps/">Model Context Protocol: MCP Apps Extension</a></p>
</li>
</ul>
]]></content:encoded></item><item><title><![CDATA[Day 40/100: The Future of Customers in the AI Revolution: Communities Will Be Key]]></title><description><![CDATA[What is Changing?
As AI continues to evolve, it’s making the process of creating products more accessible than ever. Today, with just an idea and the right tools, anyone can innovate and bring something to life. However, this newfound simplicity is s...]]></description><link>https://blog.karanbalaji.com/day-40-future-of-customers-in-the-ai-revolution-communities-will-be-key</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-40-future-of-customers-in-the-ai-revolution-communities-will-be-key</guid><category><![CDATA[Web Development]]></category><category><![CDATA[Design]]></category><category><![CDATA[UX]]></category><category><![CDATA[ux design]]></category><category><![CDATA[AI]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Thu, 12 Dec 2024 20:07:19 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1734034574382/e1431826-725c-42f4-917e-da89513f730e.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-what-is-changing">What is Changing?</h1>
<p>As AI continues to evolve, it’s making the process of creating products more accessible than ever. Today, with just an idea and the right tools, anyone can innovate and bring something to life. However, this newfound simplicity is shifting the challenge from <strong>how to create</strong> to <strong>how to acquire and retain customers</strong>. With the market flooded with similar offerings, standing out becomes the ultimate test—and <strong>communities</strong> might be the answer.</p>
<hr />
<h2 id="heading-why-communities-matter-decentralizing-monopolies-and-power">Why Communities Matter: Decentralizing Monopolies and Power</h2>
<h3 id="heading-trust-drives-consumer-choices">Trust Drives Consumer Choices</h3>
<p>One of the most influential factors in consumer decision-making is <strong>trust</strong>. People don’t just buy products; they buy into the individuals and brands behind them. Communities foster this trust through shared values, experiences, and open dialogue. When customers feel connected to a creator or brand, they’re more likely to support it—and stay loyal.</p>
<p>Take the example of <strong>potato chips</strong>:<br />Imagine choosing between five nearly identical brands in a supermarket. Now imagine a friend or community member introduces a healthier, equally tasty, and slightly innovative option. You’d likely pick their product—not just because it’s better, but because you trust the person behind it. And if the product falls short, you’d feel empowered to give honest feedback, knowing it would be acted upon.</p>
<hr />
<h3 id="heading-faster-feedback-loops">Faster Feedback Loops</h3>
<p>Communities enable creators to receive direct, real-time feedback, unlike traditional businesses relying on slow surveys or impersonal market research. This creates a <strong>fast feedback loop</strong>, allowing products to be continuously refined.</p>
<p>For instance, if your friend’s potato chips were too salty, you could immediately let them know. They might tweak the recipe based on your input, improving the product while strengthening your trust in their responsiveness.</p>
<hr />
<h3 id="heading-emotional-differentiation">Emotional Differentiation</h3>
<p>In an age of commoditization, <strong>products alone won’t set businesses apart</strong>. Emotional and social connections created through communities will be the true differentiators. Platforms like <strong>Patreon</strong>, <strong>Substack</strong>, and <strong>Heylo</strong> already show how community-driven models flourish. Consumers are willing to pay a premium for products and services that align with their values and offer a deeper connection.</p>
<hr />
<h2 id="heading-applying-nash-equilibrium-to-communities">Applying Nash Equilibrium to Communities</h2>
<h3 id="heading-removing-abuse-and-power-imbalances">Removing Abuse and Power Imbalances</h3>
<p>The <strong>Nash Equilibrium</strong>, a principle from game theory, describes a state where no participant can benefit by unilaterally changing their strategy. This concept can apply to communities, ensuring fairness and reducing abuse.</p>
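<p>A toy two-player game makes the definition concrete. The payoff numbers below are invented purely for illustration:</p>

```typescript
// Toy 2x2 game illustrating a Nash equilibrium; payoffs are invented.
// payoff[a][b] = [payoff to player 1, payoff to player 2]
const payoff: number[][][] = [
  [[3, 3], [1, 4]], // player 1 plays strategy 0
  [[4, 1], [2, 2]], // player 1 plays strategy 1
];

// (a, b) is a Nash equilibrium if neither player gains by unilaterally
// switching their own strategy while the other player's stays fixed.
function isNashEquilibrium(a: number, b: number): boolean {
  const p1Stays = payoff[a][b][0] >= payoff[1 - a][b][0];
  const p2Stays = payoff[a][b][1] >= payoff[a][1 - b][1];
  return p1Stays && p2Stays;
}

console.log(isNashEquilibrium(1, 1)); // prints: true
console.log(isNashEquilibrium(0, 0)); // prints: false (each player gains by switching)
```

<p>The analogy to communities: a healthy community culture is one where no single member or leader can gain by unilaterally behaving badly.</p>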
<h4 id="heading-case-study-midnight-runners">Case Study: Midnight Runners</h4>
<p>Two years ago, I joined <strong>Midnight Runners Toronto</strong> when it was a small group of 50 runners. Documenting its journey through Instagram reels created <strong>organic virality</strong>, growing its following to over 20,000. However, as it scaled, challenges emerged. Negative behaviors by certain leaders and members began to damage the community’s reputation.</p>
<h4 id="heading-the-hype-squad-effect">The Hype Squad Effect</h4>
<p>To counteract these dynamics, I formed the "Hype Squad," a group of 80+ members chosen not by status but by how they treated others. This diverse group—from top entrepreneurs to individuals known for kindness—helped shift the culture. We hosted potlucks, karaoke nights, and genuine celebrations, prioritizing <strong>shared experiences</strong> over hierarchy.</p>
<hr />
<h2 id="heading-how-can-creators-leverage-communities">How Can Creators Leverage Communities?</h2>
<ul>
<li><p><strong>Start Small, Build Trust</strong>: Foster a tight-knit group of early adopters and engage authentically.</p>
</li>
<li><p><strong>Create a Feedback Loop</strong>: Act on feedback to show customers their voices matter.</p>
</li>
<li><p><strong>Highlight Shared Values</strong>: Build narratives around shared goals and missions.</p>
</li>
<li><p><strong>Measure Loyalty, Not Just Growth</strong>: Focus on retention and satisfaction over numbers.</p>
</li>
<li><p><strong>Use Tools Effectively</strong>: Leverage platforms like <strong>Heylo</strong>, <strong>Patreon</strong>, or <strong>Discord</strong> for engagement and tools like <strong>Hotjar</strong> and <strong>Google Analytics</strong> for product insights.</p>
</li>
</ul>
<hr />
<h2 id="heading-challenges-of-the-community-first-approach">Challenges of the Community-First Approach</h2>
<ul>
<li><p><strong>Scalability</strong>: Keeping intimacy alive as communities grow is difficult.</p>
</li>
<li><p><strong>Transparency</strong>: Higher accountability demands openness.</p>
</li>
<li><p><strong>Competition</strong>: Differentiating yourself in a community-driven market requires creativity.</p>
</li>
</ul>
<hr />
<h1 id="heading-thriving-in-the-age-of-digital-accountability">Thriving in the Age of Digital Accountability</h1>
<h3 id="heading-key-insight-for-creators-and-future-businesses">Key Insight for Creators and Future Businesses:</h3>
<blockquote>
<p>"The power of communities lies in trust, collaboration, and distribution—not monopolies."</p>
</blockquote>
<hr />
<h2 id="heading-the-role-of-ai-in-accountability">The Role of AI in Accountability</h2>
<p>AI acts as both a <strong>tool for creators</strong> and a <strong>watchdog for humanity</strong>. It ensures transparency by sifting through data to surface truths, making it harder for businesses to obscure their actions. This transparency, however, is an opportunity to leverage AI for continuous improvement and stronger relationships.</p>
<hr />
<h2 id="heading-conclusion-the-future-is-personal">Conclusion: The Future is Personal</h2>
<p>The future belongs to those who embrace community-driven models. By prioritizing <strong>trust, collaboration, and shared ownership</strong>, businesses can thrive—not by dominating markets, but by creating ecosystems where everyone succeeds together.</p>
<hr />
<h1 id="heading-how-to-cite-this-article">How to Cite This Article</h1>
<p>Balaji, Karan. (2024, December 12). <em>Future of Customers in the AI Revolution: Communities Will Be Key</em>. Retrieved from <a target="_blank" href="https://blog.karanbalaji.com/day-40-future-of-customers-in-the-ai-revolution-communities-will-be-key">https://blog.karanbalaji.com/day-40-future-of-customers-in-the-ai-revolution-communities-will-be-key</a></p>
]]></content:encoded></item><item><title><![CDATA[Day 39 of 100 Days of Design: Guest Karan Balaji Speaking at University of Toronto on Experimentation in Human-Computer Interaction]]></title><description><![CDATA[Introduction: Guest Speaking at UoFT
Recently, I had the privilege of guest-speaking at the University of Toronto, invited by Joseph Jay Williams, to discuss a core component of human-computer interaction (HCI): experimentation and A/B testing. This ...]]></description><link>https://blog.karanbalaji.com/day-39-of-100-days-of-design-guest-karan-balaji-speaking-at-university-of-toronto-on-experimentation-in-human-computer-interaction</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-39-of-100-days-of-design-guest-karan-balaji-speaking-at-university-of-toronto-on-experimentation-in-human-computer-interaction</guid><category><![CDATA[Design]]></category><category><![CDATA[UX]]></category><category><![CDATA[AI]]></category><category><![CDATA[Data Science]]></category><category><![CDATA[software development]]></category><category><![CDATA[Frontend Development]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Tue, 29 Oct 2024 18:23:10 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1730226152979/a08b6192-6167-447e-8d79-0002dfbadae9.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-introduction-guest-speaking-at-uoft">Introduction: Guest Speaking at UoFT</h1>
<p>Recently, I had the privilege of guest-speaking at the University of Toronto, invited by Joseph Jay Williams, to discuss a core component of human-computer interaction (HCI): experimentation and A/B testing. This talk allowed me to share real-world insights from my design journey, focusing on how experimentation drives user-centered, data-driven products. The core principle of design is "Put yourself into the customer's shoes and see from every angle." When I tested this principle itself, comparing how the same designs could have been built with other tools and on other timelines, it held up as a fundamental truth. A/B testing is crucial because, with limited resources, most designers can only explore a design from a single subject-domain perspective. In the AI revolution, however, we will increasingly use generative UI and even generative UX strategies to test from multiple angles, creating more flexible and holistic design solutions. Here’s an outline of the session, structured to give students a deeper understanding of design experimentation.</p>
<h2 id="heading-what-the-role-of-ab-testing-and-experimentation-in-hci">What: The Role of A/B Testing and Experimentation in HCI</h2>
<p>In the session, I emphasized that every design choice is, in essence, a hypothesis. No matter how well-reasoned, our assumptions need validation through data. This is where A/B testing, experimentation, and metrics come into play.</p>
<p>A significant takeaway for students was understanding that design is never “final.” Continuous improvement is possible only by testing, measuring, and iterating.</p>
<h2 id="heading-why-the-importance-of-testing-hypotheses">Why: The Importance of Testing Hypotheses</h2>
<p>The key message was that assumptions alone cannot drive successful design; only data and real-world testing can. By treating each design choice as a hypothesis, we can make evidence-based improvements rather than relying solely on intuition.</p>
<p>For example, in one of my previous startups, I noticed that about 60% of users were accessing the platform on mobile. I used this insight to test a hypothesis based on two core psychological principles—Fitts's Law and Hick's Law.</p>
<p><a target="_blank" href="https://lawsofux.com/"><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1730222994236/49eddc1f-0ef2-424e-aa6f-1f3c06f23b03.png" alt class="image--center mx-auto" /></a></p>
<h3 id="heading-applying-fittss-and-hicks-laws">Applying Fitts’s and Hick’s Laws</h3>
<ul>
<li><p><strong>Fitts’s Law</strong>: This law posits that the time to reach a target depends on its distance and size, which is crucial for mobile design where users often rely on their thumbs for navigation.</p>
</li>
<li><p><strong>Hick’s Law</strong>: Hick's Law explains that having more choices can slow down decision-making. Simplifying the experience by optimizing button placement can make navigation more intuitive.</p>
</li>
</ul>
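<p>Both laws have simple standard formulations, sketched below in TypeScript. Note that the constants <code>a</code> and <code>b</code> are device- and user-specific, so the default values here are placeholders for illustration, not measured numbers:</p>

```typescript
// Fitts's law: movement time grows with distance D to the target and
// shrinks with target width W (same units, e.g. px). Constants are
// placeholder values; real ones are fit from user data.
function fittsTimeMs(D: number, W: number, a = 100, b = 150): number {
  return a + b * Math.log2((2 * D) / W);
}

// Hick's law: decision time grows logarithmically with the number of
// equally likely choices n.
function hickTimeMs(n: number, b = 150): number {
  return b * Math.log2(n + 1);
}

// A large, nearby button is faster to hit than a small, distant one:
console.log(fittsTimeMs(100, 80).toFixed(0)); // prints: 298
console.log(fittsTimeMs(400, 20).toFixed(0)); // prints: 898
// And trimming a menu from 8 options to 4 cuts decision time:
console.log(hickTimeMs(8).toFixed(0), hickTimeMs(4).toFixed(0));
```

<p>This is exactly why a thumb-reachable bottom navigation bar (shorter effective distance, larger effective targets) is a plausible hypothesis worth testing on mobile.</p>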
<p><a target="_blank" href="https://www.smashingmagazine.com/2016/09/the-thumb-zone-designing-for-mobile-users/"><img src="https://archive.smashing.media/assets/344dbf88-fdf9-42bb-adb4-46f01eedd629/496f7bc0-4c6c-4159-b731-ec3adcf91105/thumb-zone-mapping-opt.png" alt="thumb-zone-mapping-opt.png (957×618)" /></a></p>
<p>Source: Smashing Magazine</p>
<p>Hypothesizing that a bottom navigation bar could make key calls-to-action (CTAs) more accessible, I conducted an experiment. With this change, conversions increased by over 20% within two weeks, validating the hypothesis and demonstrating the power of data-driven design.</p>
<h2 id="heading-how-building-and-testing-multiple-hypotheses">How: Building and Testing Multiple Hypotheses</h2>
<p>I walked students through actionable steps on developing and testing hypotheses:</p>
<ol>
<li><p><strong>Identify Hypotheses</strong>: Every design change should begin with a clear hypothesis. For instance, “Placing CTAs within thumb reach on mobile will increase engagement.”</p>
</li>
<li><p><strong>Set Metrics for Success</strong>: Define success metrics, whether it’s increased conversions, engagement, or specific click-through rates.</p>
</li>
<li><p><strong>Gather Stakeholder Input</strong>: I involve stakeholders, such as the CMO, CTO, support teams, and customers, to gather varied perspectives on potential pain points. These insights often inspire additional hypotheses.</p>
</li>
<li><p><strong>Implement and Measure Variations</strong>: After gathering input, I create multiple design variations and test them against the baseline to identify the best-performing version.</p>
</li>
</ol>
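<p>For step 4, a common way to decide whether a variation genuinely beats the baseline (rather than winning by noise) is a two-proportion z-test. The sketch below uses invented sample numbers:</p>

```typescript
// Two-proportion z-test: is the variant's conversion rate significantly
// higher than the baseline's? Sample numbers below are invented.
function zTest(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA; // baseline conversion rate
  const pB = convB / nB; // variant conversion rate
  const pPool = (convA + convB) / (nA + nB); // pooled rate under H0
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / nA + 1 / nB));
  return (pB - pA) / se; // z-score; |z| > 1.96 ~ p < 0.05 (two-tailed)
}

// Baseline: 200/4000 converted. Bottom-nav variant: 260/4000.
const z = zTest(200, 4000, 260, 4000);
console.log(z.toFixed(2), z > 1.96 ? "significant" : "not significant");
```

<p>Tools like VWO run this kind of calculation for you, but knowing what the significance threshold means keeps you from shipping a lucky fluctuation.</p>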
<h3 id="heading-tools-of-the-trade-research-design-and-project-management">Tools of the Trade: Research, Design, and Project Management</h3>
<p>To track, analyze, and bring hypotheses to life, I rely on several tools, each fulfilling a unique role in the design and experimentation workflow:</p>
<ul>
<li><p><strong>Google Analytics and Google Tag Manager</strong>: These tools allow for custom tracking, letting me evaluate detailed engagement metrics for each design element.</p>
</li>
<li><p><strong>Hotjar</strong>: For in-depth behavioral research, Hotjar provides user session recordings, heatmaps, and insights like rage clicks, showing where users encounter frustration or drop off. This enables me to address issues beyond raw metrics, focusing on actual user behavior and interaction.</p>
</li>
<li><p><strong>Figma</strong>: Figma is central to developing design hypotheses, especially the visual components. It allows me to experiment visually, crafting multiple iterations before finalizing designs for testing.</p>
</li>
<li><p><a target="_blank" href="http://V0.dev"><strong>V0.dev</strong></a>: A recent addition to my workflow, <a target="_blank" href="http://V0.dev">V0.dev</a> is a prompt-based tool for generating multiple UI designs. This is fantastic for brainstorming, as it helps me quickly explore different UI ideas and variations, making the process of conceptualizing and iterating more efficient.</p>
</li>
<li><p><strong>GitHub Issues for Project Management</strong>: For managing tasks and tracking progress, I use <strong>GitHub’s Issues feature</strong>. I create tickets for each design and coding task, organizing them into a structured <strong>Kanban board</strong>. You can check out my active board <a target="_blank" href="https://github.com/users/karanbalaji/projects/2">here</a>, where every project phase, from hypothesis to testing, is meticulously documented. This setup is invaluable for transparent project management, allowing me to keep experiments organized and collaborate effectively.</p>
</li>
</ul>
<p>Together, these tools help me conduct research, design iteratively, and manage projects in a way that is outcome-driven and systematically refined based on data.</p>
<h2 id="heading-inspiring-resource-designing-like-a-scientist">Inspiring Resource: Designing Like a Scientist</h2>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://www.youtube.com/watch?v=XRd6Ddn4ZSY&amp;t=480s&amp;ab_channel=awwwards.">https://www.youtube.com/watch?v=XRd6Ddn4ZSY&amp;t=480s&amp;ab_channel=awwwards.</a></div>
<p> </p>
<p>In line with the talk’s themes, I pointed students to Navin Iyengar's <strong>“Design Like a Scientist”</strong> talk from Netflix, recorded on August 10, 2018. Navin explains how Netflix uses outcome-based testing to drive design decisions by experimenting with multiple variations. This approach is similar to what I call <strong>conversion-driven design</strong> and aligns with a broader concept called <strong>outcome-driven design</strong>.</p>
<p><a target="_blank" href="https://www.youtube.com/watch?v=XRd6Ddn4ZSY&amp;t=480s&amp;ab_channel=awwwards."><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1730223150815/578f0cc0-0fc1-48ef-b720-80fcbc1b9bc6.png" alt class="image--center mx-auto" /></a></p>
<p>Nielsen Norman Group noted on March 22, 2024, that the future of design will blend <a target="_blank" href="https://www.nngroup.com/articles/generative-ui/"><strong>generative UI</strong> with outcome-driven principles</a>. For my A/B testing and experimentation, I often use tools like <strong>VWO</strong>, and previously <strong>Google Optimize</strong> (which has since been sunset), which make it easy to implement, measure, and refine designs based on real user data.</p>
<p><a target="_blank" href="https://www.nngroup.com/articles/generative-ui/"><img src="https://media.nngroup.com/media/editor/2024/03/21/genui-personalized-interface-for-each-user-nng.png" alt="Today = the same interface for everyone. Future with GenUI = Personalized interface for each user" /></a></p>
<p>This concept of “design like a scientist” resonated well with the students, inspiring them to adopt a similar experimental mindset.</p>
<h2 id="heading-question-from-guoxiang-zhao-collaboration-after-handoff">Question from Guoxiang Zhao: Collaboration After Handoff</h2>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtu.be/_R6qo-jTWBI?si=L-ccYUm4YuOUq55X">https://youtu.be/_R6qo-jTWBI?si=L-ccYUm4YuOUq55X</a></div>
<p> </p>
<p>One of the most engaging moments was when a student, Guoxiang Zhao, asked, <em>“What do you do once your design work is handed off to the front-end developers? Do you continue refining the UI with them, or move on to the next project?”</em></p>
<p>My answer emphasized the iterative nature of design. Just as we treat initial design choices as hypotheses, we also validate the implemented design. I explained that I actively collaborate with the development team after handoff, measuring the design’s effectiveness and refining as needed. Experimentation is continuous, and success is proven only when real users interact with the design, with metrics confirming our goals.</p>
<h2 id="heading-real-world-example-resolving-stakeholder-feedback-through-experimentation">Real-World Example: Resolving Stakeholder Feedback through Experimentation</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1730223674402/320504ae-c5d3-488c-937a-f4663d0f54b8.png" alt class="image--center mx-auto" /></p>
<p>Finally, I discussed how experimentation helps navigate and validate stakeholder feedback. Often, team members—from executives to support staff—have diverse perspectives on design changes. By framing each idea as a hypothesis and testing it, I can systematically address everyone’s input, creating outcome-driven designs that satisfy both stakeholders and users.</p>
<h2 id="heading-conclusion-encouraging-experimentation-beyond-design">Conclusion: Encouraging Experimentation Beyond Design</h2>
<p>Speaking with U of T students was an inspiring experience. Their questions and enthusiasm showed a deep interest in exploring psychological and experimental perspectives in design. Through examples like applying Fitts’s and Hick’s Laws, I hope I illustrated that HCI is as much about understanding human behavior as it is about visual design. The students left with a newfound curiosity about testing from multiple angles, and I was reminded that experimentation is truly the key to effective, user-centered design.</p>
<p>Let’s keep iterating and experimenting!</p>
]]></content:encoded></item><item><title><![CDATA[Day 38/100: Non-Invasive Brain-Computer Interfaces: Unlocking the Power of Neural Oscillations By Ram K Pari]]></title><description><![CDATA[Introduction: A Breakthrough in Brain-Computer Interaction
In the ever-evolving field of brain-computer interfaces (BCIs), innovation is pushing the boundaries of what we can achieve. Ram Kumar Pari, a PhD candidate at Université Paul Sabatier Toulou...]]></description><link>https://blog.karanbalaji.com/day-38-non-invasive-brain-computer-interfaces-unlocking-the-power-of-neural-oscillations-by-ram-k-pari</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-38-non-invasive-brain-computer-interfaces-unlocking-the-power-of-neural-oscillations-by-ram-k-pari</guid><category><![CDATA[UX]]></category><category><![CDATA[Design]]></category><category><![CDATA[AI]]></category><category><![CDATA[Developer]]></category><category><![CDATA[technology]]></category><category><![CDATA[tech ]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Thu, 17 Oct 2024 22:06:16 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1729201557778/e36fb535-36de-443a-b5b3-e7245a07bbec.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-introduction-a-breakthrough-in-brain-computer-interaction"><strong>Introduction: A Breakthrough in Brain-Computer Interaction</strong></h1>
<p>In the ever-evolving field of brain-computer interfaces (BCIs), innovation is pushing the boundaries of what we can achieve. <strong>Ram Kumar Pari</strong>, a PhD candidate at Université Paul Sabatier Toulouse, is at the forefront of this revolution. Ram is the first scientist outside the company that developed <strong>kTMP</strong>, a new form of continuous magnetic stimulation, to use it in research. His work, combined with the development of a <strong>new neurofeedback technique</strong>, is paving the way for a future where the brain directly interacts with machines in ways that were once the stuff of science fiction.</p>
<p><img src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXeQllV-QTTMkD0KqrxH3eMVQVjK8g2rfs95jIAI7MHWDiYSC26hnwJxvgbqi0klxKlIgAgQoOSVzAzN-t_WZnXCPWN2gMfYUs6Cj7edmiZ9Jn68CNUzgRRAXU_eBd_YQJ40GuN1UP7MK-MlYyI0lHOojB8?key=N6XyH0TkT_EsyIkko3yVGQ" alt /></p>
<h2 id="heading-what-unveiling-the-mysteries-of-neural-oscillations"><strong>What: Unveiling the Mysteries of Neural Oscillations</strong></h2>
<p>At the core of non-invasive brain-computer interfaces is the intricate world of <strong>neural oscillations</strong>—the rhythmic patterns that govern cognition, perception, and consciousness. These oscillations, which range from slow delta waves to fast gamma waves, synchronize different regions of the brain, enabling cognitive and sensory functions. Ram explains that these oscillations, particularly in perception, act like a natural filter, helping the brain focus on relevant stimuli while ignoring distractions.</p>
<p><img src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXekKVcDwrF_8-03_tI0FPUnPcbhOiJupcvLYvACnj4FBCFyc_ifQLLL_cnQkfPMBmVIBUudePUddGlSiOR6qHCW0_LxjBzzzHYNU1bZ3ZWnxrj21xgVY5NxJGEhysbcFQC6u5ZpR8F6e8Du_gWQK7iDik4?key=N6XyH0TkT_EsyIkko3yVGQ" alt /></p>
<p>Ram's pioneering research focuses on decoding these oscillations through non-invasive methods like <strong>electroencephalography (EEG)</strong>, a technique that measures the brain's electrical activity. This year also marks the <strong>100th anniversary of EEG</strong>, a landmark tool that has been indispensable in allowing researchers to peer into the brain’s inner workings without invasive techniques. The centennial of EEG reminds us of its lasting importance, even as new methods like kTMP push the field of modulating these oscillations into exciting new territories.</p>
<h2 id="heading-where-are-we-so-far"><strong>Where Are We So Far?</strong></h2>
<p>As we celebrate the <strong>100th anniversary of EEG</strong>, the field of brain-computer interfaces (BCIs) has reached new heights. Ram Kumar Pari's use of <strong>kTMP</strong>, a novel form of magnetic stimulation, marks a significant advancement in how we interact with and modulate brain activity. Ram is the first scientist outside the original company to implement this technique in experiments, contributing to breakthroughs in neural oscillations research.</p>
<p>However, the <strong>price of high-quality EEG equipment</strong> remains a barrier. <strong>Research-grade devices</strong>, such as Ceegrids, can cost around <strong>$5000</strong> and require extensive research expertise to set up and use effectively. These high-end systems are crucial for fine-tuning brain oscillation measurements, but they are not clinical products. While <strong>commercial versions</strong> with <strong>dry electrodes</strong> are available at a lower price point, they often suffer from signal noise, limiting their effectiveness in precise applications.</p>
<p>The future holds promise, though, as technological innovation is driving down costs. Ram notes that future devices, which match the capabilities of today's <strong>research-grade equipment</strong>, could be available for significantly lower prices, possibly around <strong>$300</strong>. This drop in cost, combined with improved usability, could democratize access to EEG-based BCIs and bring them into mainstream use for both research and consumer applications.</p>
<h2 id="heading-why-decoding-the-brains-electrical-symphony"><strong>Why: Decoding the Brain's Electrical Symphony</strong></h2>
<p>At the core of <strong>non-invasive brain-computer interfaces (BCIs)</strong> lies the challenge of understanding the brain's complex electrical patterns. The brain’s oscillations, measured using <strong>electroencephalography (EEG)</strong>, provide a glimpse into how different regions of the brain communicate. However, as <strong>Ram</strong> explains, decoding these signals is far more complex than simply filtering out noise from sound.</p>
<p>The <strong>noise</strong> in EEG recordings is often <strong>high-dimensional</strong> and <strong>mathematically intricate</strong>, making it difficult to isolate relevant neural oscillations from the surrounding electrical activity. Ram notes that this complexity poses a significant hurdle for researchers, who must use advanced <strong>signal processing</strong> techniques to interpret the data. These techniques don't necessarily solve the noise issue but help study the <strong>modulation of brain oscillations</strong>, providing insights into how they can be influenced.</p>
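<p>As a concrete illustration of the kind of signal processing described above, here is a minimal sketch (my own toy example, not Ram's pipeline) of the most basic step: isolating the alpha band (8-12 Hz) from a noisy, EEG-like signal with a Butterworth band-pass filter via SciPy. Real EEG analysis layers artifact rejection, re-referencing, and source separation on top of this.</p>

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0  # sampling rate in Hz, typical for research-grade EEG
t = np.arange(0, 4.0, 1.0 / fs)

# Synthetic "recording": a 10 Hz alpha oscillation buried in broadband noise
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + rng.normal(scale=2.0, size=t.size)

def bandpass(signal, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)  # filtfilt avoids shifting oscillation phase

alpha = bandpass(eeg, 8.0, 12.0, fs)

# Most of the broadband noise power is removed by the filter
print(f"raw variance: {eeg.var():.2f}, alpha-band variance: {alpha.var():.2f}")
```

<p>The zero-phase filtering matters here: for oscillation research, a filter that shifted the phase of the rhythm under study would corrupt exactly the quantity being measured.</p>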
<p>One existing method is <strong>transcranial alternating current stimulation (tACS)</strong>, which uses <strong>electric fields</strong> to modulate brain activity. tACS is well-established and offers a non-invasive way to influence neural oscillations. Although it doesn’t directly address the noise problem in EEG, it provides a way to explore how specific oscillations can be targeted and manipulated for therapeutic purposes.</p>
<h2 id="heading-what-will-the-future-be"><strong>What Will the Future Be?</strong></h2>
<p>The future of <strong>non-invasive brain-computer interfaces</strong> (BCIs) holds remarkable promise. <strong>Ram Kumar Pari</strong> foresees that devices designed to directly interface with the brain will revolutionize human-computer interaction within the next <strong>5 to 10 years</strong>. This shift, which Ram calls a transition from the <strong>computing revolution</strong> to the <strong>neural revolution</strong>, will dramatically lower the barriers to these technologies.</p>
<p>Currently, <strong>research-grade devices</strong>, such as <strong>Ceegrids</strong>, are expensive, priced around <strong>$5000</strong>, and require significant expertise to set up. These devices offer high precision for research but come with challenges like <strong>complex noise filtering</strong>. However, commercial solutions, such as <strong>dry electrode</strong> versions, already exist at lower price points but can suffer from <strong>noise issues</strong>. Ram anticipates that future iterations will bring the cost down significantly, potentially as low as <strong>$300</strong> for research-grade capabilities, making them accessible to a much broader audience without compromising quality.</p>
<p>In addition to cost, Ram reveals a major breakthrough: <strong>non-invasive techniques</strong> that could treat conditions like <strong>epilepsy</strong> before resorting to <strong>invasive surgery</strong> or exhausting <strong>drug options</strong>. Currently, invasive stimulation is the last resort after drugs fail, but with non-invasive BCIs, the treatment could begin <strong>much earlier</strong>, potentially offering the same level of efficacy as drugs—without the associated side effects. This is especially important for patients like <strong>pregnant women</strong>, for whom <strong>drug treatments</strong> can pose serious risks to the fetus. By avoiding harmful side effects, non-invasive solutions could make a profound difference in treating epilepsy safely.</p>
<p>This non-invasive approach also hints at a broader revolution in <strong>healthcare</strong>, with the potential to reduce reliance on drugs and surgery, shifting treatment towards <strong>brain modulation</strong> techniques that leverage neural oscillations to restore function or alleviate conditions.</p>
<h2 id="heading-how-bridging-the-gap-external-and-internal-approaches"><strong>How: Bridging the Gap – External and Internal Approaches</strong></h2>
<p>Researchers are taking two complementary approaches to overcome current challenges in brain-computer interfaces:</p>
<ul>
<li><p><strong>External stimulation</strong>: Methods like <strong>kTMP</strong> and <strong>tACS</strong> use alternating electromagnetic fields to influence neural oscillations, allowing researchers to modulate specific brain functions. This approach has the potential to enhance cognitive functions and treat neurological disorders in a non-invasive way.</p>
</li>
<li><p><strong>Internal modulation</strong>: Techniques like <strong>neurofeedback</strong> empower individuals to consciously control their brainwaves. Through feedback and training, users can fine-tune their neural oscillations, improving cognitive performance and auditory attention in noisy scenarios.</p>
</li>
</ul>
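<p>To make the internal-modulation idea tangible, here is a toy neurofeedback loop (a hypothetical illustration, not the specific technique from the talk): it estimates alpha-band (8-12 Hz) power in one-second windows of a synthetic signal and "rewards" the user whenever that power rises above a threshold calibrated on the first window.</p>

```python
import numpy as np

fs = 250.0     # sampling rate in Hz
win = int(fs)  # one-second feedback windows

def alpha_power(window, fs):
    """Mean spectral power in the 8-12 Hz band, estimated via the FFT."""
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2
    band = (freqs >= 8.0) & (freqs <= 12.0)
    return psd[band].mean()

# Synthetic session: alpha amplitude ramps up, as if the user is learning
t = np.arange(0, 10.0, 1.0 / fs)
amp = np.linspace(0.2, 2.0, t.size)
rng = np.random.default_rng(1)
eeg = amp * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=1.0, size=t.size)

threshold = None
for i in range(0, t.size - win + 1, win):
    p = alpha_power(eeg[i:i + win], fs)
    if threshold is None:
        threshold = 1.5 * p  # calibrate on the first window
    feedback = "reward" if p > threshold else "keep trying"
    print(f"window {i // win}: alpha power {p:9.1f} -> {feedback}")
```

<p>A real system would close this loop in milliseconds against live EEG and turn the "reward" into an auditory or visual cue, but the measure-compare-feed-back cycle is the same.</p>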
<p><img src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXeD94_4gTGWSoaHNofMGGWCokFSArIx0MHXhDC8UUwSAp3nf2d5HX_aK6N2cH4nKAH871-ap7bonLdeyr6ryIMdDc_n15fkxAFHNW2x9ux4Y0AtMyHufWl9nVw88B5Mfz_gZI9_qOgWGlt7eun68JPETp5G?key=N6XyH0TkT_EsyIkko3yVGQ" alt /></p>
<h3 id="heading-practical-applications-transforming-lives"><strong>Practical Applications: Transforming Lives</strong></h3>
<p><strong>Non-invasive brain-computer interfaces</strong> (BCIs) offer groundbreaking potential across various domains, enhancing human abilities and addressing cognitive challenges in ways that could transform lives. Ram Kumar Pari highlights key areas where this technology could have a significant impact, including <strong>auditory attention</strong>, <strong>language learning</strong>, and more intuitive human-device interaction.</p>
<h4 id="heading-auditory-attention-and-the-cocktail-party-effect"><strong>Auditory Attention and the "Cocktail Party Effect"</strong></h4>
<p>One of the most compelling applications is addressing auditory attention issues in elderly adults. Ram explains that while older individuals may not suffer from hearing loss per se, they often struggle with what’s known as the <strong>"cocktail party effect"</strong>. This refers to the brain's ability to filter out irrelevant sounds in noisy environments, like focusing on a single conversation in a crowded room. The difficulty comes from altered neural oscillations that impair the ability to process relevant auditory information. Hearing aids don’t address this problem, especially when physical hearing itself is largely intact. Neurofeedback offers a solution: enhancing auditory attention by synchronizing neural oscillations with specific auditory stimuli.</p>
<p>Neurofeedback technology in general has been available for the past 20 years, but Ram emphasizes that combining advanced <strong>BCI systems</strong> with standard hearing aids offers a promising avenue for improving auditory attention in the elderly, helping them better filter and focus on relevant sounds while also compensating for age-related hearing loss.</p>
<h4 id="heading-language-learning-and-cognitive-enhancement"><strong>Language Learning and Cognitive Enhancement</strong></h4>
<p>In addition to auditory applications, non-invasive BCIs could enhance <strong>language learning</strong> and overall <strong>cognitive processing</strong>. By tapping into the brain's natural neural oscillations, these interfaces can potentially accelerate language acquisition and optimize cognitive functions in healthy individuals. For example, aligning neural oscillations with specific language inputs could improve how quickly and efficiently learners grasp new languages, opening up new pathways for global communication and exchange.</p>
<h4 id="heading-accessibility-direct-interaction-with-devices-from-the-brain"><strong>Accessibility: Direct Interaction with Devices from the Brain</strong></h4>
<p>Looking further into the future, Ram envisions a world where individuals with disabilities or cognitive challenges can interact directly with digital devices using only their brain activity. Instead of relying on traditional interfaces like keyboards or touch screens, these users could command their devices by modulating <strong>neural oscillations</strong>, effectively bridging the gap between mind and machine. This development could vastly improve <strong>accessibility</strong> and <strong>independence</strong> for individuals with motor impairments or other conditions that limit physical interaction with technology.</p>
<p>Through these applications, <strong>non-invasive BCIs</strong> hold the potential to reshape how we engage with both the physical and digital worlds, offering new tools to enhance cognition, learning, and accessibility.</p>
<h2 id="heading-towards-a-future-of-seamless-interaction"><strong>Towards a Future of Seamless Interaction</strong></h2>
<p>As foundational research progresses, Ram emphasizes the importance of pushing the field towards <strong>practical applications</strong> and <strong>consumer products</strong>. The modular and flexible nature of brain-computer interfaces allows for cost-effective development, which could lead to the creation of affordable consumer devices in the near future.</p>
<p>The implications of this technology extend beyond healthcare. BCIs have the potential to revolutionize industries like <strong>pharmaceuticals</strong>, where drug-free treatments for neurological disorders become a reality. In this future, the boundaries between human cognition and digital interfaces blur, enabling a new era of seamless interaction between the brain and the digital world.</p>
<p><img src="https://lh7-rt.googleusercontent.com/docsz/AD_4nXduaTfZCqJn1zRxn47wpIeu33jABwGUuzSnNY38XQst5PN7a5mnlKCF8fXveRx7Ocp731mRYtwF23n4juPcqf4iAFMdYHqkvTrUlnp7xdXRA6ZGVjySqvt6SKKJr6VWt-ruA-9xlAtFSkiJk22z_0A431ql?key=N6XyH0TkT_EsyIkko3yVGQ" alt /></p>
<h2 id="heading-conclusion-from-the-brain-to-the-world"><strong>Conclusion: From the Brain to the World</strong></h2>
<p>Ram Kumar Pari's groundbreaking research into non-invasive brain-computer interfaces, particularly his work with <strong>kTMP</strong> and new <strong>neurofeedback techniques</strong>, is unlocking the potential of neural oscillations to transform the way we interact with technology. As the field evolves, the possibilities for direct brain-to-device interactions grow ever closer, heralding a future where the brain seamlessly bridges the gap between human cognition and digital environments.</p>
]]></content:encoded></item><item><title><![CDATA[Day 37/100:  Design Thinking to AI Thinking by Alipta Ballav]]></title><description><![CDATA[What: Redefining UX: Embracing AI and Design Thinking
Alipta Ballav, Senior Design Manager at Microsoft, presented a thought-provoking shift in the UX design landscape during the BeMore Festival. The conversation centered on moving beyond the traditi...]]></description><link>https://blog.karanbalaji.com/day-37-design-thinking-to-ai-thinking-by-alipta-ballav</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-37-design-thinking-to-ai-thinking-by-alipta-ballav</guid><category><![CDATA[UX]]></category><category><![CDATA[Design]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[AI]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Wed, 02 Oct 2024 19:18:13 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1727896778148/f6da152c-3521-43a3-b719-d7d2740b4f0e.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-what-redefining-ux-embracing-ai-and-design-thinking">What: Redefining UX: Embracing AI and Design Thinking</h1>
<p>Alipta Ballav, Senior Design Manager at Microsoft, presented a thought-provoking shift in the UX design landscape during the BeMore Festival. The conversation centered on moving beyond the traditional "design thinking" methodology and adopting an AI-infused mindset that enhances how designers approach problems.</p>
<h2 id="heading-the-foundations-of-design-thinking">The Foundations of Design Thinking</h2>
<p>Design thinking has been a cornerstone in the world of UX design for years, focusing on empathy, defining problems, ideating solutions, prototyping, and testing. This user-centered process has been key in creating meaningful and innovative experiences for users.</p>
<p><img src="https://media.nngroup.com/media/editor/2016/07/25/designthinking_illustration_final2-02.png" alt="Design Thinking 101" /></p>
<h2 id="heading-expanding-the-designers-toolkit-with-ai-thinking">Expanding the Designer's Toolkit with AI Thinking</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1727890872324/8dabacc3-f352-479c-8748-7a2b1841616d.png" alt class="image--center mx-auto" /></p>
<p>Ballav believes the future of design requires an expanded perspective. This new mindset, "AI thinking," involves looking at problems through a computational lens. By leveraging knowledge bases, understanding cultural nuances, and dealing with unstructured data, AI thinking allows designers to solve complex challenges with greater precision. This shift opens up new avenues for innovation, evolving beyond the confines of traditional design thinking.</p>
<hr />
<h1 id="heading-why-the-need-for-a-polymath-mindset-in-ux-design">Why: The Need for a Polymath Mindset in UX Design</h1>
<p>To stay relevant in a rapidly evolving industry, designers need to adopt a polymath mindset. Ballav draws inspiration from historical figures like Leonardo da Vinci, who is widely regarded as a polymath due to his expertise across multiple disciplines like civil engineering, chemistry, biology, and architecture. Similarly, the future of design will require professionals who can connect the dots between diverse fields like AI, economics, social sciences, business, psychology, and UX.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1727893053644/50dbaae7-6b97-4ca3-9c4f-6f41994610e4.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-reference-to-leonardo-da-vinci">Reference to Leonardo da Vinci</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1727893020241/1961465f-4d03-4204-83c0-fdde9b5389ed.png" alt class="image--center mx-auto" /></p>
<p>Ballav cited da Vinci as an example of the polymath ideal—a "comb set profile" where individuals possess deep expertise in multiple areas. This cross-disciplinary knowledge allows designers to approach problems from different angles and create more holistic solutions. He also referenced Don Norman, who, during his visit to India, mentioned that designers should not only focus on their craft but also consider economics, politics, and other subjects to better understand the world. Even Jakob Nielsen has a similar take, that the future of design will be more generalist; here is a recap article I wrote: <a target="_blank" href="https://blog.karanbalaji.com/day-27-navigating-the-future-of-ux-specialist-vs-generalist-insights-with-jakob-nielsen-sarah-gibbons">https://blog.karanbalaji.com/day-27-navigating-the-future-of-ux-specialist-vs-generalist-insights-with-jakob-nielsen-sarah-gibbons</a></p>
<h2 id="heading-zooming-out-for-broader-context">Zooming Out for Broader Context</h2>
<p>"Design is more about zooming out and less about zooming in," Ballav said, stressing that UX designers need to broaden their understanding of society, culture, and technology. The ability to connect the dots across different disciplines will enable designers to craft solutions that are not only innovative but also practical and rooted in real-world dynamics. Don Norman shared a similar take, which I covered in a previous article: <a target="_blank" href="https://blog.karanbalaji.com/day-31-don-norman-on-designing-beyond-aesthetics-embracing-a-humanitarian-centric-and-generalist-future">https://blog.karanbalaji.com/day-31-don-norman-on-designing-beyond-aesthetics-embracing-a-humanitarian-centric-and-generalist-future</a></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1727893164381/a1d6a71d-66a4-47f0-b546-032f82da511e.png" alt class="image--center mx-auto" /></p>
<hr />
<h1 id="heading-how-applying-ai-thinking-in-ux-design">How: Applying AI Thinking in UX Design</h1>
<p>Ballav provided real-world applications of how AI thinking can complement design thinking, particularly in improving hospital patient experiences. The screenshot below shows how this is approached through a design thinking lens.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1727893657101/e60a1cfc-df4e-4bf4-a483-93048fc52341.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-through-ai-thinking-lens">Through AI Thinking Lens:</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1727893738156/dafcf889-41a8-489d-abff-801c6fb2d906.png" alt class="image--center mx-auto" /></p>
<h3 id="heading-data-collection-and-analysis">Data Collection and Analysis</h3>
<p>AI thinking starts with leveraging data from various sources like sensors, electronic health records, and user feedback. This data-driven approach provides a more comprehensive understanding of the problem that traditional research methods might overlook.</p>
<h3 id="heading-predictive-analytics">Predictive Analytics</h3>
<p>By analyzing this data, designers can predict high-traffic periods in a hospital and suggest optimal staffing levels. This anticipation reduces wait times, optimizing the patient experience.</p>
<h3 id="heading-automation-and-chatbots">Automation and Chatbots</h3>
<p>AI can streamline administrative tasks, such as scheduling and billing, by automating processes and deploying chatbots for common queries. This frees up medical staff to focus on personalized care, improving overall patient satisfaction.</p>
<h3 id="heading-enhanced-patient-monitoring">Enhanced Patient Monitoring</h3>
<p>AI-powered systems can monitor patients in real-time and alert staff to any critical changes, preventing complications and ensuring timely interventions.</p>
<hr />
<h1 id="heading-why-ai-and-design-thinking-work-together">Why AI and Design Thinking Work Together</h1>
<p>Ballav stresses that AI thinking doesn’t replace design thinking but acts as a catalyst, enabling designers to harness data-driven insights alongside their human-centered expertise. The symbiotic relationship between design and AI allows for more comprehensive problem-solving.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1727893961692/ef4852c3-5b50-4c5e-aa9a-c03d12147adf.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-the-cultural-nuances-of-common-sense">The Cultural Nuances of Common Sense</h2>
<p>A key aspect of AI thinking is understanding cultural nuances and reasoning. Ballav gave an example of crossing the road to illustrate this. In countries like India, crossing the road doesn’t follow the same logic as in Western cultures, where people wait for signals. Instead, people actively navigate traffic by making cars stop—a behavior deeply rooted in cultural practice. This variance highlights the challenge of teaching AI to understand and respect cultural differences.</p>
<p>This example reminded me of Don Norman’s concept of cultural adaptation in design. He once shared that his kitchen design, while simple for him, would pose a challenge for anyone unfamiliar with it. Similarly, AI thinking requires designers to consider cultural reasoning when developing systems that cater to diverse backgrounds. This ability to understand multiple perspectives will help UX professionals create more personalized and inclusive experiences.</p>
<h2 id="heading-the-future-of-ux-design">The Future of UX Design</h2>
<p>As the world becomes increasingly digitized, the intersection of AI and design will be crucial in developing user-centric products. Designers who embrace this AI-infused mindset will lead the charge in creating innovative solutions for the future.</p>
<p>By adopting AI thinking, designers can expand their capabilities and adapt to the evolving needs of users in the digital age.</p>
]]></content:encoded></item><item><title><![CDATA[Day 36/100: Minimum Lovable Product by Sunil Subramanian at BeMore Festival 2024]]></title><description><![CDATA[What: Building a Minimum Lovable Product (MLP)
During the BeMore Festival by ADPList, Sunil Subramanian shared his insights on creating a Minimum Lovable Product (MLP). While many product creators focus on building a Minimum Viable Product (MVP), Sun...]]></description><link>https://blog.karanbalaji.com/day-36-minimum-lovable-product-by-sunil-subramanian-at-bemore-festival-2024</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-36-minimum-lovable-product-by-sunil-subramanian-at-bemore-festival-2024</guid><category><![CDATA[UX]]></category><category><![CDATA[Design]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Product Management]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Thu, 19 Sep 2024 13:52:13 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1726753503594/e2614453-d72f-45d9-a3dc-cb0e1dc02852.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-what-building-a-minimum-lovable-product-mlp"><strong>What: Building a Minimum Lovable Product (MLP)</strong></h2>
<p>During the BeMore Festival by ADPList, Sunil Subramanian shared his insights on creating a Minimum Lovable Product (MLP). While many product creators focus on building a Minimum Viable Product (MVP), Sunil emphasizes the importance of not just functionality, but delight, engagement, and connection with users.</p>
<h3 id="heading-moving-beyond-mvp-to-mlp"><strong>Moving Beyond MVP to MLP</strong></h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1726753584108/704c87bc-1d07-4410-9f81-879dc3462f08.png" alt class="image--center mx-auto" /></p>
<p>The concept of MVP centers around getting a product out the door with basic functionality. However, a Minimum Lovable Product goes beyond this by ensuring that the product is not only functional but also creates an emotional connection with users. An MLP stands out through simplicity, power, delight, and community.</p>
<hr />
<h2 id="heading-how-the-key-elements-for-creating-an-mlp"><strong>How: The Key Elements for Creating an MLP</strong></h2>
<h3 id="heading-4-key-traits-for-an-mlp"><strong>4 Key Traits for an MLP:</strong></h3>
<ol>
<li><p><strong>Simple:</strong> The product must be clear, easy to understand, and complete in its core function. Complexity can drive users away, so simplicity is key.</p>
</li>
<li><p><strong>Powerful:</strong> Like Apple's iPod tagline, "1,000 songs in your pocket," the product should provide a powerful, memorable message that resonates with users.</p>
</li>
<li><p><strong>Delightful:</strong> First impressions matter. The product must be visually appealing and delightful upon first glance or use.</p>
</li>
<li><p><strong>Tribal Effect:</strong> Like the excitement surrounding Apple product launches, the MLP should generate excitement among a community of users who love the product and eagerly anticipate its release.</p>
</li>
</ol>
<h3 id="heading-metrics-for-measuring-success"><strong>Metrics for Measuring Success</strong></h3>
<p>Sunil highlighted that the key metrics for a Minimum Lovable Product depend on the business stage:</p>
<ul>
<li><p><strong>Explore Phase:</strong> At this stage, you’re looking for engagement metrics rather than conversion rates. Measure how users interact with your product before optimizing for other KPIs.</p>
</li>
<li><p><strong>Expansion Phase:</strong> When expanding, look at Monthly Active Users (MAU) and Daily Active Users (DAU) to gauge the product's traction.</p>
</li>
</ul>
<p>Ultimately, engagement is often a more meaningful metric than conversion alone, as it reflects how well the product resonates with users.</p>
<h3 id="heading-essential-traits-for-building-mlps"><strong>Essential Traits for Building MLPs</strong></h3>
<p>Sunil outlined five essential traits to consider when building an MLP:</p>
<ol>
<li><p><strong>You Can't Know What Customers Want:</strong> Start by understanding your users' "why" and focus on solving their problem—not just any problem.</p>
</li>
<li><p><strong>Discovery Over Delivery:</strong> Stick to a two-week rule for product iterations. Separate the problem and solution space, and always aim for the product to be valuable, feasible, and usable.</p>
</li>
<li><p><strong>Data Over Opinions:</strong> Data should guide product decisions. Never launch without analytics, but be data-informed, not data-driven, to maintain flexibility.</p>
</li>
<li><p><strong>Pivots Over Plans:</strong> It’s important to iterate quickly, about 5-15 times per week if necessary. Vision pivots can be more powerful than simply sticking to rigid plans.</p>
</li>
<li><p><strong>Culture Over Process:</strong> Validate ideas early, and aim for the fastest and cheapest ways to test hypotheses before running out of resources.</p>
</li>
</ol>
<hr />
<h2 id="heading-why-key-takeaways-for-success"><strong>Why: Key Takeaways for Success</strong></h2>
<ol>
<li><p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1726753548340/903cdad8-2684-43a3-bdac-66f9c56843ee.png" alt class="image--center mx-auto" /></p>
<p> <strong>Be a Good Listener:</strong> Don't just listen to customers—listen to your team. Collaboration within your team can spark new solutions or insights.</p>
</li>
<li><p><strong>Solve the Highest-Value Problem:</strong> Focus on how to incentivize customers effectively by addressing the most valuable pain points.</p>
</li>
<li><p><strong>Validate Your Riskiest Assumptions:</strong> When you have 1,000 ideas, test the riskiest assumption first to mitigate failure and focus resources on what matters.</p>
</li>
<li><p><strong>Differentiators Matter:</strong> When many products are similar, strong differentiators are crucial to stand out in the market.</p>
</li>
<li><p><strong>Target the Right Customer Segment:</strong> Correctly segmenting your customer base is vital for ensuring that you are solving the right problems for the right people.</p>
</li>
<li><p><strong>Identify Magic Moments:</strong> Look for moments in the customer journey that stand out—these can be leveraged to create loyalty and excitement around your product.</p>
</li>
</ol>
<hr />
<h2 id="heading-my-qampa-what-are-the-key-metrics-for-minimum-lovable-product"><strong>My Q&amp;A: What are the key metrics for Minimum Lovable Product</strong></h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1726753764039/a15b093e-5baa-4d53-866f-4b9948e69bf6.png" alt class="image--center mx-auto" /></p>
<p>Sunil emphasized that the top metrics for an MLP can vary depending on the project, business model, and its stage of development. For instance, during the exploration phase or the expansion stage, different priorities may arise. However, one of the most universally important metrics is <strong><em>engagement</em></strong>, often considered more critical than just focusing on conversions. Other important ones are Monthly Active Users (MAUs), or Daily Active Users (DAUs).</p>
<h2 id="heading-conclusion-building-a-product-people-love"><strong>Conclusion: Building a Product People Love</strong></h2>
<p>Building a Minimum Lovable Product requires more than just functionality; it involves creating a product that is simple, powerful, delightful, and loved by a community. By focusing on the right traits, listening to both your team and customers, and using data to drive decisions, you can create products that don't just work—but that people love.</p>
]]></content:encoded></item><item><title><![CDATA[Day 35/100: Ethan Evan's 5 Magic Steps for Career Success]]></title><description><![CDATA[What: 5 Magic Steps for Career Success
Ethan Evans, a seasoned executive with a rich background, shared his life story and invaluable advice for the modern generation. In our fireside chat, we delved into his upbringing, life philosophy, and practica...]]></description><link>https://blog.karanbalaji.com/day-35-ethan-evans-5-magic-steps-for-career-success</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-35-ethan-evans-5-magic-steps-for-career-success</guid><category><![CDATA[UX]]></category><category><![CDATA[Design]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Career]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Mon, 15 Jul 2024 17:35:12 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1721064034286/5f9083ef-329c-49c4-a10e-699574f46016.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h3 id="heading-what-5-magic-steps-for-career-success">What: 5 Magic Steps for Career Success</h3>
<p>Ethan Evans, a seasoned executive with a rich background, shared his life story and invaluable advice for the modern generation. In our fireside chat, we delved into his upbringing, life philosophy, and practical steps for achieving career success.</p>
<hr />
<h3 id="heading-why-learning-from-ethan-evans-journey">Why: Learning from Ethan Evans' Journey</h3>
<p>Understanding the journey and insights of successful individuals like Ethan can provide guidance and inspiration. His emphasis on self-teaching, enjoying work, and strategic thinking offers a roadmap for personal and professional growth.</p>
<h4 id="heading-ethans-life-history">Ethan’s Life History</h4>
<p>Ethan Evans grew up in rural Ohio, in a region known for farming. Despite holding advanced degrees (PhDs), Ethan's parents also worked as farmers, a second job they enjoyed immensely. This blend of education and hands-on work ethic shaped Ethan's perspective on life and career.</p>
<h4 id="heading-life-advice-for-the-modern-generation">Life Advice for the Modern Generation</h4>
<p><strong>Self-Taught Programming:</strong><br />Ethan taught himself programming, embodying the belief that lifelong learning is crucial. He advocates for finding work you love so that dedication and focus become enjoyable rather than a chore. According to Ethan, working hard should not feel like a penalty if you are passionate about what you do.</p>
<h4 id="heading-what-are-executives">What Are Executives?</h4>
<p><strong>Focus and Priorities:</strong><br />Executives focus on strategy, determining what needs to be done and setting priorities. This contrasts with management, which is more concerned with the mechanics of administering a group of people.</p>
<p><strong>Common Management Mistakes:</strong><br />Ethan referenced Marshall Goldsmith's book "What Got You Here Won't Get You There" to highlight a common mistake: relying too heavily on strengths without adapting to new challenges.</p>
<h4 id="heading-other-general-talks">Other General Talks</h4>
<p><strong>Study What You See:</strong><br />Ethan emphasized the importance of observation and critical thinking. Watching a movie, for instance, is different from truly observing and analyzing it.</p>
<p><strong>Upgrade Your Surroundings:</strong><br />He advised upgrading the people you are around. If you think your manager is bad, learn how they got to their position and what you can learn from their journey.</p>
<hr />
<h3 id="heading-how-implementing-ethans-advice">How: Implementing Ethan's Advice</h3>
<p><strong>Self-Teaching and Passion:</strong><br />Identify areas you are passionate about and seek resources to self-teach and improve your skills.</p>
<p><strong>Strategic Thinking:</strong><br />Focus on the broader strategy and priorities in your role, rather than just the day-to-day mechanics.</p>
<p><strong>Observation and Learning:</strong><br />Cultivate critical thinking and observation skills to learn from your surroundings and improve your decision-making.</p>
<p><strong>Upgrading Your Network:</strong><br />Surround yourself with people who inspire and challenge you to grow.</p>
<p><strong>Practical Steps for Career Success:</strong><br />Ethan outlined five simple steps for career success:</p>
<ol>
<li><p><strong>Do Your Job Well:</strong> Ensure you are performing your current job to the best of your ability and identify areas for improvement.</p>
</li>
<li><p><strong>Offer Help:</strong> Ask your manager if they need assistance with anything.</p>
</li>
<li><p><strong>Take Initiative:</strong> Execute the tasks you take on diligently.</p>
</li>
<li><p><strong>Seek Opportunities:</strong> Inquire if there are tasks that align with your career goals.</p>
</li>
<li><p><strong>Repeat the Cycle:</strong> Continuously go through this cycle to advance your career.</p>
</li>
</ol>
<p>By integrating these principles, individuals can cultivate a growth mindset, achieve career success, and enjoy their work. Ethan's advice underscores the importance of passion, strategy, and continuous improvement in both personal and professional life.</p>
]]></content:encoded></item><item><title><![CDATA[Day 34/100: Embracing a Growth Mindset and Grit with Guy Kawasaki]]></title><description><![CDATA[Growth, Grit and Grace.
For Day 34 of my #100DaysOfDesign journey, I had the pleasure of diving into the insights and life lessons from Guy Kawasaki, a man who embodies the essence of a growth mindset and grit. Guy Kawasaki, known for his podcast "Re...]]></description><link>https://blog.karanbalaji.com/day-34-embracing-a-growth-mindset-and-grit-with-guy-kawasaki</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-34-embracing-a-growth-mindset-and-grit-with-guy-kawasaki</guid><category><![CDATA[Design]]></category><category><![CDATA[UX]]></category><category><![CDATA[Apple]]></category><category><![CDATA[Web Development]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Wed, 03 Jul 2024 23:16:05 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1720047344645/7f72d0be-7473-4dba-b02e-4ce3884a7d98.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h3 id="heading-growth-grit-and-grace">Growth, Grit and Grace.</h3>
<p>For Day 34 of my #100DaysOfDesign journey, I had the pleasure of diving into the insights and life lessons from Guy Kawasaki, a man who embodies the essence of a growth mindset and grit. Guy Kawasaki, known for his podcast "<a target="_blank" href="https://podcasts.apple.com/us/podcast/guy-kawasakis-remarkable-people/id1483081827">Remarkable People</a>," shares his journey and philosophy that inspire us to think remarkable, embrace growth, and persist with grit.</p>
<hr />
<h4 id="heading-what-understanding-guy-kawasakis-journey">What: Understanding Guy Kawasaki's Journey</h4>
<p><strong>A Remarkable Journey</strong></p>
<p>Guy Kawasaki's life is a testament to overcoming challenges with a positive outlook. Despite living with Meniere's disease, which causes hearing loss and vertigo, Guy remains humble and thankful. He grew up in a lower-middle-class family, and his parents' emphasis on education significantly shaped his path. Though he dropped out of law school after just two weeks, he attributes his success to the strong educational foundation his parents provided.</p>
<p>Growing up in Hawaii, a place rich in racial and cultural diversity, also influenced his perspective on life. The diversity of his environment taught him to appreciate different viewpoints and fostered a broader understanding of the world.</p>
<p><strong>Connecting the Dots</strong></p>
<p>Reflecting on his journey, Guy often quotes Steve Jobs: "You can only connect the dots looking backwards." This perspective became clear when he saw his friends driving Porsches and Rolls Royces, motivating him to study hard and strive for success. His experience at Theranos taught him a crucial lesson: we must question authority and understand that wisdom and insight can come from anyone, regardless of their position or background.</p>
<p><strong>Growth Mindset and Grit</strong></p>
<p>Guy believes that while some people naturally possess a growth mindset, not everyone can develop it to the same extent. A growth mindset is about believing that with effort and determination, one can achieve almost anything. However, external factors and environments significantly influence this mindset.</p>
<p><strong>Cultivating a Growth Mindset Environment</strong></p>
<p>Guy emphasizes that the environment in which we are raised plays a crucial role in developing a growth mindset. In societies where failure is punished, and traditional career paths like government jobs or medicine are prioritized, individuals might feel restricted. Conversely, in environments that encourage entrepreneurship and innovation, people are more likely to adopt a growth mindset.</p>
<p><strong>Defining Grit</strong></p>
<p>Grit, according to Guy, is the ability to keep going when things are no longer easy or fun. It's about showing up consistently and persevering through challenges. Combining perseverance with grit and resilience can propel individuals to achieve their goals and continue growing, even in the face of adversity. Guy encapsulates this philosophy with the mantra: Growth, Grit, and Grace.</p>
<hr />
<h4 id="heading-why-the-importance-of-growth-mindset-and-grit">Why: The Importance of Growth Mindset and Grit</h4>
<p><strong>Building a Life or Career to be Proud Of</strong></p>
<p>For young individuals, Guy advises not to stress over making perfect decisions. Instead, he encourages making any decision and then working hard to make it the right one. Life is long, and there is ample time to pivot and grow from each experience.</p>
<p><strong>How to Be Remarkable: Insights from Canva</strong></p>
<p>Drawing inspiration from Guy's collaboration with Canva, the key to being remarkable involves embracing creativity, persistence, and a willingness to learn. The lessons shared in their collaborative efforts highlight the importance of continuously striving to improve and innovate.</p>
<hr />
<h4 id="heading-how-applying-guy-kawasakis-insights">How: Applying Guy Kawasaki's Insights</h4>
<p><strong>Practical Tips for Embracing a Growth Mindset and Grit</strong></p>
<ul>
<li><p><strong>Develop a Growth Mindset:</strong> Believe that with effort and determination, you can achieve almost anything. Surround yourself with environments and people that encourage entrepreneurship and innovation.</p>
</li>
<li><p><strong>Cultivate Grit:</strong> Keep going even when things are no longer easy or fun. Show up consistently and persevere through challenges. Remember Guy's mantra: Growth, Grit, and Grace.</p>
</li>
<li><p><strong>Make Decisions and Adapt:</strong> Don't stress over making perfect decisions. Make any decision and then work hard to make it the right one. Life is long, and there is plenty of time to grow from each experience.</p>
</li>
</ul>
<p><strong>Learning from Guy Kawasaki's Journey</strong></p>
<ul>
<li><p><strong>Embrace Diversity:</strong> Growing up in a diverse environment taught Guy to appreciate different viewpoints and fostered a broader understanding of the world.</p>
</li>
<li><p><strong>Question Authority:</strong> Guy's experience at Theranos taught him to question authority and understand that wisdom and insight can come from anyone, regardless of their position or background.</p>
</li>
</ul>
<p><strong>How to Be Remarkable: Insights from Canva</strong></p>
<ul>
<li><p><strong>Embrace Creativity and Persistence:</strong> Continuously strive to improve and innovate.</p>
</li>
<li><p><strong>Learn from Others:</strong> Draw inspiration from collaborations and the experiences of others.</p>
</li>
</ul>
<hr />
<h4 id="heading-final-thoughts">Final Thoughts</h4>
<p>Guy Kawasaki's insights on growth mindset and grit offer valuable lessons for anyone looking to build a fulfilling life or career. His journey teaches us that with the right mindset, environment, and perseverance, we can overcome challenges and achieve remarkable things.</p>
<p>To delve deeper into these lessons, check out the valuable insights shared by the Canva team on <a target="_blank" href="https://www.canva.com/learn/guy-kawasaki/#lessons">How to Be Remarkable by Guy Kawasaki</a>.</p>
]]></content:encoded></item><item><title><![CDATA[Day 33/100 : Enhancing Collaboration Between UX and Marketing with Aaron Yen]]></title><description><![CDATA[For day 33 of #100DaysOfDesign, I had the privilege to speak with Aaron Yen, Marketing Manager II at Amazon on the Alexa Canada team. Our conversation delved into the intricate dynamics between UX and marketing, offering valuable insights on how thes...]]></description><link>https://blog.karanbalaji.com/day-33-enhancing-collaboration-between-ux-and-marketing-with-aaron-yen</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-33-enhancing-collaboration-between-ux-and-marketing-with-aaron-yen</guid><category><![CDATA[UX]]></category><category><![CDATA[Design]]></category><category><![CDATA[marketing]]></category><category><![CDATA[Web Development]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Wed, 12 Jun 2024 12:40:30 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1717988046709/25d98e96-8bdd-48f6-8a14-2a70d1288caa.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>For day 33 of #100DaysOfDesign, I had the privilege to speak with <a target="_blank" href="https://www.linkedin.com/in/aaronjyen/">Aaron Yen, Marketing Manager II at Amazon on the Alexa Canada team</a>. Our conversation delved into the intricate dynamics between UX and marketing, offering valuable insights on how these two crucial domains can collaborate effectively to drive successful projects. This article provides practical advice on how UX designers can work seamlessly with marketing teams from Aaron's experiences.</p>
<h3 id="heading-the-gap">The Gap:</h3>
<p><strong>What is the difference between UX and Marketing for a business strategy?</strong></p>
<p>UX and marketing, while seemingly at odds, are intrinsically linked. Marketing focuses on maximizing return on investment by aligning prospective customers with value, often through strategic and targeted efforts. On the other hand, UX is dedicated to creating seamless, enjoyable experiences that ensure continued product use. Without quality products, marketers cannot convert leads, and without interested leads, UX cannot serve a customer base. The synergy between these fields is essential for a holistic business strategy.</p>
<p><strong>Example</strong>: When Bell designed the touchtone telephone, they tested three designs, choosing the fastest one with the lowest error rate. This decision, driven by empirical data, set a new norm in telecommunications.</p>
<h4 id="heading-why-is-business-design-important">Why is Business Design Important?</h4>
<p>Aaron explains that business design and UX design are crucial for creating products and services that not only delight users but also achieve business objectives. Business design focuses on integrating design thinking into strategic planning, ensuring solutions are economically viable and aligned with market needs. UX design, on the other hand, prioritizes the user's experience, making products intuitive and enjoyable through rigorous research and testing. The intersection of these disciplines ensures that user-centered designs also support business growth and innovation, leading to solutions that are both effective and sustainable. Together, they create a synergy that drives successful and impactful product development.</p>
<p><img src="https://www.rotman.utoronto.ca/-/media/Images/Programs-and-Areas/BDI/BD-Method---4-steps.jpg?h=265&amp;w=723&amp;la=en&amp;hash=0C0A0B6EE51704F404305B0ABA4A08A8066AA90F" alt="Click here to view a hi-res version" /></p>
<p><a target="_blank" href="https://www.rotman.utoronto.ca/FacultyAndResearch/EducationCentres/BusinessDesignInitiative">Source: Rotman</a></p>
<p><strong>How do UX and Marketing fit into Business Design for a company?</strong></p>
<p>Business design integrates product development into broader business contexts, iterating on revenue models, operations, strategies, and technologies. UX and marketing apply customer-centric focuses from different angles, both benefiting from prototyping and iterative testing. This approach allows companies to balance financial influx with necessary updates, driving innovation and growth.</p>
<p><strong>Example</strong>: Business design at the University of Toronto's Rotman School of Management involves applying product development principles to business operations, strategy, and technology, allowing for rapid adaptation and growth.</p>
<h3 id="heading-practical-collaboration-tips">Practical Collaboration Tips</h3>
<p><strong>What are some practical tips for UX designers to improve their collaboration with marketing teams? Are there specific tools or processes you recommend?</strong></p>
<ol>
<li><p><strong>Skill Development for UX Designers</strong>: UX designers should develop a T-shaped skillset, integrating marketing knowledge. Follow vocal industry professionals on LinkedIn for the latest trends and strategies. This approach helps UX designers understand the broader context of their work and how it aligns with marketing goals.</p>
</li>
<li><p><strong>Effective Communication and Collaboration</strong>: Strategies for effective communication include aligning on goals and metrics. Consult finance teams to prioritize key business metrics, then map these to pre-sale (marketing) and post-sale (UX) stages. Regular meetings and collaborative tools like Slack, Trello, or Asana can facilitate better communication and project management.</p>
</li>
<li><p><strong>Aligning Goals and Metrics</strong>: Use a northstar scorecard to highlight business priorities. For example, if brand awareness is a key metric, UX designers can create experiences that enhance engagement and reach. Aligning on a few key metrics helps both teams focus on what matters most for business success.</p>
</li>
<li><p><strong>User Research Integration</strong>: Integrate UX user research into marketing strategies to create a cohesive user journey. Understand customer appeal at each stage to build a complete business strategy. Sharing user research findings with marketing teams can inform campaign strategies and help tailor messages to customer needs.</p>
</li>
<li><p><strong>Maintaining Design Consistency</strong>: Ensure consistency in design and brand voice across UX and marketing efforts. UX teams can contribute by adhering to brand guidelines and creating experiences that align with the overall brand message. Regular reviews and collaborative workshops can ensure that both teams are on the same page.</p>
</li>
</ol>
<h3 id="heading-context-of-marketing-for-ux-roles">Context of Marketing for UX Roles</h3>
<p><strong>What are the levels of marketing from startup to mid-sized to enterprise?</strong></p>
<ul>
<li><p><strong>Startup Marketing</strong>: Startup marketing is narrow. It’s quick and focused on generating outcomes from targeted customers, often with the intention of driving conversion, with an emphasis on UX driving the quality of uniquely differentiating or best-in-class product experiences.</p>
</li>
<li><p><strong>Mid-sized Marketing</strong>: Strategies from individual contributors and/or managers revolve around growth-oriented development and product trends. There is a balance of relationship building among customers through a sustained middle-funnel of consideration campaigns, where UX can be segmented into prime-focused features and premium offerings.</p>
</li>
<li><p><strong>Enterprise Marketing</strong>: Strategies are highly affected by reputation through partnerships, awareness to gain market share among large segments, and customer value that is value-based and revenue-driven. UX has a role here in applying experiences that support the brand, while changing aspects of customization that relate to bigger-picture influences, like thematic or sales calendars.</p>
</li>
</ul>
<h3 id="heading-skill-development-for-ux-designers">Skill Development for UX Designers</h3>
<p><strong>What skills or knowledge should UX designers develop to better understand and support marketing efforts? How can they stay informed about the latest marketing trends and strategies?</strong></p>
<p>Just as marketers should be T-shaped in their skillset, with a deep foundation in one specific area supplemented by arms that reach broadly into a variety of others, UX designers can adopt marketing as one of those arms. My best tip for staying on top of the latest marketing trends is to follow vocal LinkedIn profiles in industries and companies that interest you. Simply following marketing or product managers who love to showcase their work can turn LinkedIn into a news outlet faster than traditional media, because posts come from the direct source: the people who built those things.</p>
<h4 id="heading-effective-communication-and-collaboration">Effective Communication and Collaboration</h4>
<p><strong>Can you share strategies/tips for ensuring effective communication and collaboration between UX and marketing teams? What are some examples of successful projects where these departments worked well together?</strong></p>
<p>Aligning goals and metrics between UX and marketing teams is crucial. By creating a simple, short-listed northstar scorecard that highlights the barebones impact of the business priorities, both teams can ensure they are working towards a common objective. Understanding the financial aspects of business, as well as customer engagement metrics, helps in creating a unified approach.</p>
<p><strong>Example</strong>: At Amazon, aligning the goals of the Alexa marketing and UX teams involved regular collaborative sessions to review customer feedback and marketing performance, ensuring a cohesive strategy that enhanced both user experience and market reach.</p>
<h4 id="heading-aligning-goals-and-metrics">Aligning Goals and Metrics</h4>
<p><strong>How can UX and marketing teams align their goals and success metrics to ensure they are working towards a common objective? What are some practical steps to achieve this alignment?</strong></p>
<p>The best “process” for reaching alignment is to first consult finance teams on their KPIs for the monetization and revenue-generating aspects of the business. From there, prioritize 3-5 singular metrics that describe the business's position and growth from a financial perspective, and then ladder these into pre- and post-sale stages (pre-sale for marketing, post-sale for UX).</p>
<h4 id="heading-user-research-integration">User Research Integration</h4>
<p><strong>How can user research conducted by UX teams be effectively integrated into marketing strategies? Could you share an example of how user insights have positively influenced a marketing campaign?</strong></p>
<p>Marketing and UX are at opposite ends of a customer journey spectrum. Marketing is about creating and generating leads before a customer purchases a product, and design is about keeping customers engaged with the experience of the product. Both are dependent on each other for a full user journey, and understanding what appeals to customers at each stage is crucial for building business.</p>
<p><strong>Example</strong>: At Amazon, user research insights about voice assistant preferences were integrated into Alexa's marketing strategies, highlighting features that resonated most with users, thereby increasing user engagement and sales.</p>
<h4 id="heading-maintaining-design-consistency">Maintaining Design Consistency</h4>
<p><strong>What are the best practices for maintaining design consistency and brand voice across UX and marketing efforts? How can UX teams contribute to ensuring a cohesive brand experience?</strong></p>
<p>Maintaining design consistency involves regular communication and collaboration between UX and marketing teams. UX teams can ensure a cohesive brand experience by adhering to brand guidelines and creating user interfaces that reflect the brand’s voice and values. Regular design reviews and collaborative workshops can help both teams stay aligned.</p>
<h3 id="heading-top-tips-for-ux-professionals">Top Tips for UX Professionals</h3>
<p><strong>What advice would you give to UX professionals to better understand and collaborate with marketing teams? How can UX designers play a significant role in developing and enhancing marketing strategies?</strong></p>
<ul>
<li><p><strong>Ask Good Questions</strong>: Engage deeply in the product development lifecycle to understand marketing needs.</p>
</li>
<li><p><strong>Stay Updated</strong>: Follow trends in marketing, such as AI-driven campaign customization, to find parallels with UX design.</p>
</li>
<li><p><strong>Align Goals</strong>: Work closely with marketing to align on key metrics and objectives.</p>
</li>
<li><p><strong>Collaborate Regularly</strong>: Hold regular meetings and workshops to ensure ongoing collaboration and alignment.</p>
</li>
</ul>
<h4 id="heading-marketing-focused-topics">Marketing-Focused Topics</h4>
<p><strong>What are the latest trends in the marketing world with regards to creative development that could be similar to UX?</strong></p>
<p>Similarly to UX design, marketing leverages AI to expedite and mass-release campaigns for products. Customizing experiences for target audiences via prompt engineering has automated traditionally tedious aspects of marketing, like multicultural tailoring or localization, shifting managers' focus away from execution and toward more strategic concerns such as wider outreach and partnerships.</p>
<h4 id="heading-challenge">Challenge</h4>
<p><strong>Marketing and UX are at opposite ends of a customer journey spectrum. As a marketer who has not seen the product, they develop a funnel to highlight the value of the product. How do we take the budget or reputation we have and add as much value as we can to ensure a seamless customer experience?</strong></p>
<h3 id="heading-join-the-conversation">Join the Conversation</h3>
<p>Join our #100DaysOfDesign Discord channel to discuss more on this topic anonymously and share your events and perspectives to continuously learn: <a target="_blank" href="https://discord.gg/emkkRvTRTR">https://discord.gg/emkkRvTRTR</a>.</p>
]]></content:encoded></item><item><title><![CDATA[Day 32/100: Jakob Nielsen's 10 Foundational UX Insights]]></title><description><![CDATA[Why: Understanding the Importance of UX
Jakob Nielsen unveiled his 10 foundational UX insights, delving into the core principles that shape effective user experiences. By focusing on empirical data, the business value of UX, and the integration of AI...]]></description><link>https://blog.karanbalaji.com/day-32-jakob-nielsens-10-foundational-ux-insights</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-32-jakob-nielsens-10-foundational-ux-insights</guid><category><![CDATA[Design]]></category><category><![CDATA[UX]]></category><category><![CDATA[Web Development]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Thu, 16 May 2024 23:43:42 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1715902706904/940d52fe-c939-4570-9d79-d1c1682c59fc.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h3 id="heading-why-understanding-the-importance-of-ux"><strong>Why: Understanding the Importance of UX</strong></h3>
<p>Jakob Nielsen unveiled his 10 foundational UX insights, delving into the core principles that shape effective user experiences. By focusing on empirical data, the business value of UX, and the integration of AI, Nielsen emphasizes the necessity of designing with the user's needs and behaviors in mind. His teachings offer a roadmap for designers to create intuitive, efficient, and engaging interfaces.</p>
<h3 id="heading-what-key-takeaways-from-jakob-nielsens-insights"><strong>What: Key Takeaways from Jakob Nielsen's Insights</strong></h3>
<ol>
<li><p><strong>Empirical Data from Representative Users</strong></p>
<ul>
<li><p><strong>Example</strong>: When Bell was designing the touchtone telephone, they tested three designs. They chose the fastest one, which became the norm, showing that empirical data can guide revolutionary decisions.</p>
</li>
<li><p><strong>Insight</strong>: Users may not always know what's best for them, but data-driven choices can lead to groundbreaking standards.</p>
</li>
</ul>
</li>
<li><p><strong>Business Value of UX</strong></p>
<ul>
<li><p><strong>Example</strong>: Improving the checkout process on an e-commerce site can lead to higher conversion rates and customer satisfaction.</p>
</li>
<li><p><strong>Insight</strong>: Investing in UX leads to higher customer satisfaction, better retention rates, and increased business value.</p>
</li>
</ul>
</li>
<li><p><strong>Discount Usability</strong></p>
<ul>
<li><p><strong>Example</strong>: Conducting quick, low-cost usability tests on a new app feature to gather feedback and make improvements.</p>
</li>
<li><p><strong>Insight</strong>: Fast and cheap usability testing leads to more iterations and ideas, improving overall quality.</p>
</li>
</ul>
</li>
<li><p><strong>Augmenting Human Intellect - Doug Engelbart</strong></p>
<ul>
<li><p><strong>Example</strong>: AI-enhanced writing tools boost productivity and output quality compared to non-AI tools.</p>
</li>
<li><p><strong>Insight</strong>: AI, when used synergistically, boosts human intellect and productivity.</p>
</li>
</ul>
</li>
<li><p><strong>GUI/WIMP</strong></p>
<ul>
<li><p><strong>Example</strong>: Combining traditional GUIs with voice-activated assistants for a more seamless user experience.</p>
</li>
<li><p><strong>Insight</strong>: While GUIs have dominated, AI advancements suggest a future with a blend of GUI and new interaction models.</p>
</li>
</ul>
</li>
<li><p><strong>Hypertext - Ted Nelson</strong></p>
<ul>
<li><p><strong>Example</strong>: Modern websites using hypertext to link related content, enhancing user navigation.</p>
</li>
<li><p><strong>Insight</strong>: The evolution of hypertext has shaped the way we interact with information, paving the way for modern web design.</p>
</li>
</ul>
</li>
<li><p><strong>Information Foraging</strong></p>
<ul>
<li><p><strong>Example</strong>: Designing website menus and links that help users find information quickly and efficiently, much like wolves hunting for food.</p>
</li>
<li><p><strong>Insight</strong>: Understanding this behavior helps in designing more intuitive navigation structures.</p>
</li>
</ul>
</li>
<li><p><strong>Irrational Users: Paradox of Active User - John M. Carroll</strong></p>
<ul>
<li><p><strong>Example</strong>: Designing software that is intuitive and easy to use without requiring users to read extensive manuals.</p>
</li>
<li><p><strong>Insight</strong>: Users rarely read manuals or watch demos. Designs must be intuitive enough to be grasped without prior knowledge.</p>
</li>
</ul>
</li>
<li><p><strong>SuperApps - WeChat</strong></p>
<ul>
<li><p><strong>Example</strong>: Integrating messaging, payments, and social media into a single app to streamline user experience.</p>
</li>
<li><p><strong>Insight</strong>: SuperApps like WeChat integrate multiple services, simplifying user experience and increasing engagement.</p>
</li>
</ul>
</li>
<li><p><strong>AI (Intent-Based Outcome Specification + Individualization)</strong></p>
<ul>
<li><p><strong>Example</strong>: Personalized news feeds that adapt to user preferences and behaviors.</p>
</li>
<li><p><strong>Insight</strong>: AI will enable personalized experiences by understanding user intent and providing tailored outcomes.</p>
</li>
</ul>
</li>
</ol>
<h3 id="heading-how-applying-nielsens-insights-to-future-ux-design"><strong>How: Applying Nielsen's Insights to Future UX Design</strong></h3>
<p>Nielsen's 10 foundational UX insights provide a comprehensive framework for designers. By leveraging empirical data, businesses can make informed decisions that revolutionize user experiences. Investing in UX drives higher customer satisfaction and business value. Quick, iterative usability testing enhances quality, while AI augments human intellect and productivity. Combining traditional GUIs with innovative interaction models ensures seamless user experiences. Understanding information foraging and designing intuitive navigation structures helps users find what they need efficiently. Embracing SuperApps and personalized AI-driven experiences paves the way for future UX design, making interfaces more engaging and user-friendly. These insights equip designers to create intuitive, efficient, and personalized user experiences, setting new standards in the industry.</p>
<h3 id="heading-my-question-to-jakob"><strong>My Question to Jakob</strong></h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1715902851844/6eaf9c00-9c96-49f2-9577-c17f9309b6ad.png" alt class="image--center mx-auto" /></p>
<p>I had the opportunity to ask Jakob Nielsen a question:</p>
<p><strong>Q: What are the best ways to learn how humans function or make decisions?</strong></p>
<p><strong>A: Observing people is the best way and creating a global community to learn in a big group will help in learning from multiple perspectives.</strong></p>
<h3 id="heading-join-the-conversation"><strong>Join the Conversation</strong></h3>
<p>Join the #100DaysOfDesign challenge on Discord to share and learn from other designers. Discuss events, gain insights, and connect with peers to enhance your design journey. Find out about future events I'll be attending and let's chat live on Discord: <a target="_blank" href="https://discord.gg/sTsRXs3YuH">Discord Invite</a></p>
]]></content:encoded></item><item><title><![CDATA[Day 31/100: Don Norman on Designing Beyond Aesthetics: Embracing a Humanitarian-Centric and Generalist Future]]></title><description><![CDATA[In a thought-provoking fireside chat, Don Norman, a luminary in the field of design and cognitive science, shares his evolved perspective on the role of design in addressing the world's most pressing challenges. Norman, renowned for his groundbreakin...]]></description><link>https://blog.karanbalaji.com/day-31-don-norman-on-designing-beyond-aesthetics-embracing-a-humanitarian-centric-and-generalist-future</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-31-don-norman-on-designing-beyond-aesthetics-embracing-a-humanitarian-centric-and-generalist-future</guid><category><![CDATA[Design]]></category><category><![CDATA[UX]]></category><category><![CDATA[AI]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Developer]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Mon, 18 Mar 2024 22:21:11 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1710800100037/6f4a6e43-3f61-47a8-a8b3-e5a04f8d197b.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In a thought-provoking fireside chat, Don Norman, a luminary in the field of design and cognitive science, shares his evolved perspective on the role of design in addressing the world's most pressing challenges. Norman, renowned for his groundbreaking work and influential books, offers profound insights into the transformative power of design in creating a better world.</p>
<p>#100DaysofDesign group: <a target="_blank" href="https://discord.gg/c44P8VNgTe">https://discord.gg/c44P8VNgTe</a></p>
<h2 id="heading-why-normans-shift-towards-a-global-mission"><strong>(Why) Norman's Shift Towards a Global Mission</strong></h2>
<p>Initially viewing feedback as a marker of error, Norman's journey led him to appreciate the broader implications of design. His retirement marked a turning point, as he observed the myriad crises plaguing the world—famine, wars, climate crisis. Determined to make a difference, Norman identified human behavior as the pivotal factor in these global challenges, prompting a deep dive into how design can effect real change.</p>
<h2 id="heading-what-the-evolving-role-of-design"><strong>(What) The Evolving Role of Design</strong></h2>
<h3 id="heading-the-philosophy-of-enjoyment-and-impact"><strong>The Philosophy of Enjoyment and Impact</strong></h3>
<ul>
<li><p><strong>Human Behavior:</strong> Central to changing major global problems.</p>
</li>
<li><p><strong>Enjoying Life:</strong> Balancing personal fulfillment with making a positive impact.</p>
</li>
</ul>
<h3 id="heading-designs-central-role-in-society"><strong>Design's Central Role in Society</strong></h3>
<ul>
<li><p><strong>A Generalist's Craft:</strong> Designers must understand a broad spectrum of disciplines, including politics, economics, and art, to truly innovate.</p>
</li>
<li><p><strong>Beyond Human-Centered Design:</strong> Advocating for a shift towards humanitarian-centric design, focusing not just on individual needs but on societal and environmental benefits.</p>
</li>
</ul>
<h3 id="heading-the-need-for-comprehensive-skills"><strong>The Need for Comprehensive Skills</strong></h3>
<ul>
<li><p><strong>Beyond Craft:</strong> Understanding the interconnectedness of business, societal benefit, and environmental sustainability.</p>
</li>
<li><p><strong>Global and Historical Awareness:</strong> Recognizing the importance of societal, political, and historical knowledge in creating impactful design.</p>
</li>
</ul>
<h2 id="heading-how-normans-vision-for-future-design"><strong>(How) Norman's Vision for Future Design</strong></h2>
<h3 id="heading-embracing-complexity-and-ethics-in-ai"><strong>Embracing Complexity and Ethics in AI</strong></h3>
<ul>
<li><p><strong>Shifting Focus:</strong> From human-centric to humanitarian-centric design.</p>
</li>
<li><p><strong>AI as a Partner:</strong> Optimistic about AI's role in improving lives, emphasizing the need for ethical considerations and human values in AI development.</p>
</li>
</ul>
<h3 id="heading-simplicity-versus-function"><strong>Simplicity Versus Function</strong></h3>
<ul>
<li><p><strong>Complexity and Perspective:</strong> Acknowledging that simplicity is subjective and depends on individual understanding and context.</p>
</li>
<li><p><strong>Design Learning Curve:</strong> The importance of creating designs that are intuitively understandable yet adaptable to the complexity of real-world applications.</p>
</li>
</ul>
<h3 id="heading-the-futurist-technologist"><strong>The Futurist Technologist</strong></h3>
<ul>
<li><p><strong>Generalist versus Specialist:</strong> Norman advocates for a generalist approach, emphasizing the importance of interdisciplinary learning and the inclusion of humanities in design thinking.</p>
</li>
<li><p><strong>Continuous Learning:</strong> The principle of lifelong learning as a cornerstone of innovation and societal improvement.</p>
</li>
</ul>
<h3 id="heading-designing-for-a-better-world"><strong>Designing for a Better World</strong></h3>
<p>Norman's call to action is clear: designers have a profound opportunity, and responsibility, to contribute to a better world. By adopting a holistic, interdisciplinary approach and focusing on humanitarian outcomes, designers can address the root causes of global challenges. Norman's vision extends beyond traditional design thinking, advocating for a world where design is a catalyst for positive change, balancing human needs with the health of our planet and society.</p>
<p>As we reflect on Norman's insights, it becomes evident that the future of design lies in our ability to adapt, learn, and apply our skills towards creating sustainable, ethical, and universally beneficial solutions. The path forward is not just about aesthetics or usability; it's about leveraging design as a powerful tool for social and environmental stewardship.</p>
<h2 id="heading-my-question-to-don-norman">My Question To Don Norman</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1710801910952/e70fafa0-6de3-465e-b2cd-b3d32b298c93.png" alt class="image--center mx-auto" /></p>
<p>In a thrilling exchange during the fireside chat with Don Norman, I had the opportunity to ask a question that delves deep into the intersection of design and other disciplines. My inquiry was, 'Considering coding, psychology, anthropology, cognitive neuroscience, and behavioral economics, which do you believe would most effectively complement my design skills, or is there another area you'd suggest?'</p>
<p>A huge thank you to everyone who voted for my question, which Don Norman elaborated on during his talk. He emphasized the importance of learning about people not only through disciplines like anthropology and psychology but also from novelists who observe and write about the simple, everyday aspects of human behaviour, such as why people scratch or dance.</p>
<p>This interaction was a highlight for me, spotlighting the importance of interdisciplinary knowledge in pushing the boundaries of design thinking.</p>
<h2 id="heading-meme-the-matrix-of-design">(Meme) The Matrix of Design</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1710800054627/b6d2d19d-5ce5-4006-b9dd-0659a736d203.png" alt class="image--center mx-auto" /></p>
<p>In a moment of profound realization, Don Norman gazes out his window and observes a world entirely crafted by human hands. This epiphany highlights the artificial nature of our surroundings, underscoring the pivotal role of design in shaping our reality. This reflection leads Norman to delve deeper into history, seeking the roots of design principles that have guided human innovation over centuries.</p>
<p>This journey of discovery echoes the iconic scene from "The Matrix," where the protagonist is offered a choice between the red pill and the blue pill—a decision between embracing the harsh truths of reality or remaining in the comfort of blissful ignorance. Norman's moment of realization is akin to choosing the red pill, embarking on a quest to understand the historical and artificial constructs of our world.</p>
<p>Through this lens, Norman posits that the essence of design lies not just in the creation of aesthetically pleasing or functional objects but in the profound understanding of humanity's history and artificial constructs. This perspective challenges us to think beyond the surface, urging designers to consider the broader implications of their work on society and the environment.</p>
<p>By weaving together the threads of history, human behavior, and design, Norman invites us to view our role as designers in a new light. We are not just creators of objects or experiences but architects of the future, with the power to shape the world in ways that resonate with the deepest aspects of human existence.</p>
<p>As we ponder Norman's insights, we are reminded of the power of design to transcend the artificial, to question the status quo, and to forge a path towards a more thoughtful, humanitarian-centric future. In this "Matrix" of design, every decision, every creation, has the potential to redefine our world and our place within it.</p>
<p>Join the conversation and explore how we can all contribute to a better world through design. #DesignForABetterWorld #HumanitarianDesign #DonNorman #FutureOfDesign</p>
]]></content:encoded></item><item><title><![CDATA[Day 30/100: Julie Zhou on the Art of Feedback]]></title><description><![CDATA[In the collaborative world of design and leadership, feedback often carries an ambiguous aura—sometimes welcome, often dreaded. Julie Zhou's Fireside Chat on "How to Give and Receive Feedback" offers a profound shift from the conventional wisdom that...]]></description><link>https://blog.karanbalaji.com/day-30-julie-zhou-on-the-art-of-feedback</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-30-julie-zhou-on-the-art-of-feedback</guid><category><![CDATA[Design]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[technology]]></category><category><![CDATA[UX]]></category><category><![CDATA[webdev]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Thu, 07 Mar 2024 04:40:07 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1709786145686/02e64b8b-e341-4dbb-96fd-325baae5180a.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>In the collaborative world of design and leadership, feedback often carries an ambiguous aura—sometimes welcome, often dreaded. Julie Zhou's Fireside Chat on "How to Give and Receive Feedback" offers a profound shift from the conventional wisdom that surrounds this crucial interaction. Here's how Julie reshapes our understanding and practice of feedback into an art form that accelerates growth and fosters strong relationships. This is the <a target="_blank" href="https://www.youtube.com/watch?v=SWoZ5t5o24A">YouTube link</a> to the talk and the summary is below.</p>
<p><strong>Join #100DaysOfDesign Discord:</strong> <a target="_blank" href="https://discord.gg/emkkRvTRTR"><strong>https://discord.gg/emkkRvTRTR</strong></a></p>
<h2 id="heading-why-reframing-feedback-as-a-gift"><strong>(Why) Reframing Feedback as a Gift</strong></h2>
<p>Julie Zhou confesses that she once viewed feedback as an indicator of error, a one-way street from the more experienced to the novices, where positive feedback was synonymous with good. However, she now champions feedback as a precious gift, a catalyst for personal and professional development. It's a tool for illumination, revealing the blind spots we all possess and motivating us to action that we can take pride in.</p>
<h2 id="heading-what-strategies-for-eliciting-and-offering-better-feedback"><strong>(What) Strategies for Eliciting and Offering Better Feedback</strong></h2>
<h3 id="heading-1-picking-your-level-of-feedback"><strong>1. Picking Your Level of Feedback</strong></h3>
<p>Julie introduces a tiered approach to feedback, suggesting we ask for or give:</p>
<ul>
<li><p><strong>Task-Specific Feedback:</strong> Focused on the intricacies of a particular task.</p>
</li>
<li><p><strong>Behavioral Feedback:</strong> Centered on the patterns of behavior and their impacts.</p>
</li>
<li><p><strong>360 Feedback:</strong> A comprehensive look from all angles, encompassing all aspects of performance and interaction.</p>
</li>
</ul>
<h3 id="heading-2-crafting-your-inquiries-and-statements"><strong>2. Crafting Your Inquiries and Statements</strong></h3>
<p>Julie provides a toolkit for soliciting feedback effectively:</p>
<ul>
<li><p><strong>For Task-Specific Feedback:</strong></p>
<ul>
<li><p>What aspects of my work shone brightly?</p>
</li>
<li><p>How could this work be further enhanced?</p>
</li>
</ul>
</li>
<li><p><strong>For Behavioral Feedback:</strong></p>
<ul>
<li><p>What strengths of mine do you believe should be emphasized?</p>
</li>
<li><p>Are there any behaviors or habits you feel are limiting my potential?</p>
</li>
</ul>
</li>
<li><p><strong>For Meta Feedback:</strong></p>
<ul>
<li>How can we establish a productive feedback exchange?</li>
</ul>
</li>
</ul>
<p>When giving feedback, Julie suggests using a structure that communicates the action, your response to it, and the reason why, followed by an optional suggestion for alternative actions. Most importantly, check if your feedback resonates with the recipient.</p>
<h3 id="heading-3-making-feedback-a-regular-practice"><strong>3. Making Feedback a Regular Practice</strong></h3>
<p>Creating a routine around feedback turns it into a habit:</p>
<ul>
<li><p><strong>For Giving Feedback:</strong></p>
<ul>
<li>Aim to offer feedback daily or whenever a certain emotion is triggered.</li>
</ul>
</li>
<li><p><strong>For Requesting Feedback:</strong></p>
<ul>
<li>Regularly ask for feedback, perhaps monthly or after completing a significant task.</li>
</ul>
</li>
</ul>
<h2 id="heading-how-navigating-the-landscape-of-feedback"><strong>(How) Navigating the Landscape of Feedback</strong></h2>
<h3 id="heading-receiving-critical-feedback"><strong>Receiving Critical Feedback</strong></h3>
<p>When faced with negative feedback, Julie advises always to respond with gratitude and maintain a positive demeanor, recognizing the emotional weight it may carry.</p>
<h3 id="heading-giving-tough-feedback"><strong>Giving Tough Feedback</strong></h3>
<p>When offering critical feedback, Julie emphasizes:</p>
<ul>
<li><p><strong>Checking Your Intentions:</strong> Ensure the feedback serves the recipient, not just your perspective.</p>
</li>
<li><p><strong>Expressing Your True Feelings:</strong> Be candid about the emotions the feedback process evokes in you.</p>
</li>
<li><p><strong>Inviting Their Perspective:</strong> Actively listen to their side of the story.</p>
</li>
<li><p><strong>Building Relationships:</strong> Remember, the best relationships are often forged through the fires of honest, challenging conversations.</p>
</li>
</ul>
<p>Feedback is not just about pointing out what's wrong—it's a powerful force for growth and learning. By adopting Julie Zhou's nuanced approach, we can transform feedback from a dreaded task into an opportunity for building understanding and strengthening connections. Let's reshape our work cultures to view feedback as the gift it truly is—a step towards where we want to be, faster and with more confidence. 🌟 #Leadership #DesignThinking #FeedbackIsAGift</p>
]]></content:encoded></item><item><title><![CDATA[Day 29/100: Vitaly Friedman's Strategic Mastery in Design KPIs, Design SOW, and Social Equity]]></title><description><![CDATA[Dive into the insightful world of Vitaly Friedman, the founder of smashingmagazine.com, as he shares his strategic mastery in design hosted by our rockstar Felix Lee from AdpList.com . His approach not only enhances visual appeal but also embeds soli...]]></description><link>https://blog.karanbalaji.com/day-29-vitaly-friedmans-strategic-mastery-in-design-kpis-design-sow-social-equity</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-29-vitaly-friedmans-strategic-mastery-in-design-kpis-design-sow-social-equity</guid><category><![CDATA[UX]]></category><category><![CDATA[Design]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Developer]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Thu, 29 Feb 2024 01:45:14 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1709170228431/a82af4d3-0a5d-4230-a49b-3cfb66a33851.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Dive into the insightful world of Vitaly Friedman, the founder of <a target="_blank" href="http://smashingmagazine.com">smashingmagazine.com</a>, as he shares his strategic mastery in design hosted by our rockstar Felix Lee from <a target="_blank" href="http://adplist.org">AdpList.com</a> . His approach not only enhances visual appeal but also embeds solid strategies and genuine human connections into the fabric of UX design. This exploration offers fresh perspectives on achieving success in design frameworks.</p>
<p><strong>#100DaysOfDesign Discord:</strong> <a target="_blank" href="https://discord.gg/emkkRvTRTR"><strong>https://discord.gg/emkkRvTRTR</strong></a></p>
<h2 id="heading-why-harnessing-social-equity-and-strategic-design"><strong>(Why) Harnessing Social Equity and Strategic Design</strong></h2>
<h3 id="heading-the-essence-of-social-equity-in-design"><strong>The Essence of Social Equity in Design</strong></h3>
<ul>
<li><p><strong>Investment in People:</strong> Vital importance of building strong relationships with colleagues, clients, and collaborators.</p>
</li>
<li><p><strong>Trust and Collaboration:</strong> The impact of trust-based connections on creativity and innovation.</p>
</li>
<li><p><strong>The Power of a Small Circle:</strong> A small group of passionate individuals can significantly influence success.</p>
</li>
</ul>
<h3 id="heading-strategic-design-management"><strong>Strategic Design Management</strong></h3>
<ul>
<li><p><strong>Beyond Creativity:</strong> The necessity of a strategic approach for navigating design project complexities.</p>
</li>
<li><p><strong>Ten-Point SOW Framework:</strong> Demonstrates the importance of detailed planning and clear alignment between clients and designers.</p>
</li>
</ul>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1709168332465/6cd87f1c-6b88-48d0-ab51-1e50ffd589b2.png" alt class="image--center mx-auto" /></p>
<h2 id="heading-what-vitaly-friedmans-blueprint-for-design-excellence"><strong>(What) Vitaly Friedman's Blueprint for Design Excellence</strong></h2>
<h3 id="heading-cultivating-a-human-centric-writing-approach"><strong>Cultivating a Human-Centric Writing Approach</strong></h3>
<ul>
<li><p><strong>Authentic Content:</strong> Advocacy for genuine, human-centric writing over trend-focused articles and AI-generated content.</p>
</li>
<li><p><strong>Regular Contribution:</strong> The benefits of committing to writing two articles per week to foster learning and growth.</p>
</li>
</ul>
<h3 id="heading-bridging-design-and-business-through-kpis"><strong>Bridging Design and Business Through KPIs</strong></h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1709170581392/ce007ce9-1102-43d0-bdb9-3fdbf97c072a.png" alt class="image--center mx-auto" /></p>
<p>Vitaly Friedman emphasizes the importance of strategically aligning design with business objectives to enhance the value of design within a business context. This strategic alignment not only demonstrates the tangible impact of design decisions but also ensures these decisions are measurable and directly contribute to business success. A prime example Friedman showcases to illustrate this concept involves a multi-tiered approach to KPIs, which bridges the gap between business metrics and design initiatives:</p>
<ul>
<li><p><strong>Business KPI:</strong> Referral Rate - a key performance indicator that tracks how effectively users refer others to the product.</p>
</li>
<li><p><strong>Design KPI:</strong> Average Recommendations per User in a Month - this metric delves deeper into user behavior, focusing on how design influences the likelihood of recommendations.</p>
</li>
<li><p><strong>Measure:</strong> Percentage of Users Who Have Referred at Least One Person to Our Products Over the Last 6 Months - a clear, quantifiable insight into referral rates over a significant period, directly linked to the effectiveness of design strategies.</p>
</li>
<li><p><strong>Design Initiative:</strong> Improve Branding Perception by Featuring Successful Case Studies - an actionable initiative that enhances the product's branding perception and, in turn, increases referral rates.</p>
</li>
</ul>
<p>This example encapsulates how design KPIs can be meticulously mapped to business KPIs, ensuring that every design initiative is not only aligned with but actively contributes to the overarching business goals. Through such strategic alignment, design transcends its traditional boundaries and becomes a pivotal driver of business success.</p>
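<p>The cascade described above, from business KPI down to design initiative, can be sketched as a small data structure. The following is a minimal, hypothetical Python sketch; the class, function, and sample figures are illustrative and not from the talk itself:</p>
<pre><code class="lang-python">from dataclasses import dataclass

# Hypothetical model of Friedman's KPI cascade: each business KPI is
# backed by a design KPI, a concrete measure, and the initiative
# expected to move that measure. All names are illustrative.

@dataclass
class KpiMapping:
    business_kpi: str        # e.g. "Referral Rate"
    design_kpi: str          # behavior-level metric the design team owns
    measure: str             # how the design KPI is quantified
    design_initiative: str   # action expected to move the measure

def referral_measure(referrals_per_user: dict) -> float:
    """The 'Measure' tier: % of users who referred at least one person."""
    if not referrals_per_user:
        return 0.0
    referrers = sum(1 for n in referrals_per_user.values() if n &gt;= 1)
    return 100.0 * referrers / len(referrals_per_user)

mapping = KpiMapping(
    business_kpi="Referral Rate",
    design_kpi="Average recommendations per user in a month",
    measure="% of users who referred at least 1 person in the last 6 months",
    design_initiative="Feature successful case studies to improve branding perception",
)

# Toy data: 2 of 4 users made at least one referral.
print(referral_measure({"a": 2, "b": 0, "c": 1, "d": 0}))  # prints 50.0
</code></pre>
<p>The point of the structure is traceability: if the measure moves after the initiative ships, the design team can argue a direct line from design work to the business KPI.</p>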
<h2 id="heading-how-implementing-friedmans-visionary-strategies"><strong>(How) Implementing Friedman's Visionary Strategies</strong></h2>
<h3 id="heading-building-and-nurturing-relationships"><strong>Building and Nurturing Relationships</strong></h3>
<ul>
<li><p><strong>Proactive Relationship Building:</strong> Seeking opportunities for meaningful connections and being accessible for collaboration.</p>
</li>
<li><p><strong>Creating a Supportive Community:</strong> The importance of a dynamic and supportive design community.</p>
</li>
</ul>
<h3 id="heading-decoding-the-dialogue-the-contrast-between-business-and-ux-language-in-design-strategy">Decoding the Dialogue: The Contrast Between Business and UX Language in Design Strategy</h3>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1709170542417/60174a8e-0201-48da-91fe-225e2d10c54c.png" alt class="image--center mx-auto" /></p>
<p>In this insightful snapshot from a Fireside Chat with Vitaly Friedman, we're presented with a compelling comparison between Business language and UX language. The image highlights how the vocabulary we choose can significantly impact our approach to design and user experience. On one side, we see Business language with aggressive terms such as "Conquer," "Capture," and "Destroy," which reflect a combative and market-driven focus. On the flip side, UX language uses nurturing terms like "Reduce friction," "Empower users," and "Develop empathy," indicating a more user-centered approach that aims to create an inclusive and supportive customer experience. This contrast not only showcases the distinct mindsets of business versus user experience but also emphasizes the importance of aligning our communication with our design and business goals.</p>
<h3 id="heading-bonus-figma-embracing-a-strategic-design-framework"><strong>Bonus Figma: Embracing a Strategic Design Framework</strong></h3>
<ul>
<li><p><strong>Disciplined Planning and Execution:</strong> The need for meticulous planning, from crafting detailed SOWs to aligning design KPIs with business goals.</p>
</li>
<li><p><strong>Design Strategy Canvas:</strong> Utilizing tools like Friedman's design strategy canvas for visual roadmap and alignment with broader business strategies.</p>
</li>
</ul>
<p>Here is the Figma link for the <a target="_blank" href="https://www.figma.com/file/FUmHT27G8cwmFt7J4YTV5J/%E2%9B%B3-UX-Strategy-Canvas-(Copy)?type=whiteboard&amp;node-id=0-1&amp;t=VmP0jsA3G5076dq2-0">Strategic Design Framework</a> by Vitaly, which is still a work in progress.</p>
<p>Friedman's insights offer a comprehensive guide for navigating the design industry with a blend of aesthetics, usability, and strategic planning. By building relationships, adopting strategic frameworks, and aligning design with business outcomes, designers can create impactful work and drive meaningful change.</p>
<h2 id="heading-join-the-100daysofdesign-discord-channel">Join the <strong>#100DaysOfDesign Discord Channel</strong></h2>
<p>Join via the <a target="_blank" href="https://discord.gg/emkkRvTRTR">Discord link</a> to share and learn from other designers about the events and materials they are attending or exploring. This platform provides a unique opportunity to discover future events I'll be attending and writing about. Together, we can attend these events and engage in live discussions on Discord, enriching our design journey with collective insights and experiences. Don't miss out on this chance to connect and grow with a vibrant community of designers!</p>
]]></content:encoded></item><item><title><![CDATA[Day 28/100: Exploring Design Insights from Unity Apple Vision Pro Conference]]></title><description><![CDATA[This article is about the Unity Apple Vision Pro conference, where the future of spatial design takes center stage. We're diving deep into the insights and challenges shared by luminaries in the field, uncovering how they're reshaping our digital int...]]></description><link>https://blog.karanbalaji.com/day-28-exploring-design-insights-from-unity-apple-vision-pro-conference</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-28-exploring-design-insights-from-unity-apple-vision-pro-conference</guid><category><![CDATA[UX]]></category><category><![CDATA[Design]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[vision pro]]></category><category><![CDATA[Apple]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Tue, 27 Feb 2024 23:28:31 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1709075557178/2972b382-3729-42b0-aa60-728b5b8bfc05.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>This article is about the Unity Apple Vision Pro conference, where the future of spatial design takes center stage. We're diving deep into the insights and challenges shared by luminaries in the field, uncovering how they're reshaping our digital interactions.</p>
<p><strong>#100DaysOfDesign Discord:</strong><a target="_blank" href="https://discord.gg/emkkRvTRTR"><strong>https://discord.gg/emkkRvTRTR</strong></a></p>
<h2 id="heading-why-embracing-the-future-of-spatial-design"><strong>(Why) Embracing the Future of Spatial Design</strong></h2>
<p>Immerse yourself in the dynamic potential of mixed reality as Andrew Eiche articulates its power to open exploratory realms for users. Then hear Stefan's candid revelations about the opaque challenges of transitioning UX from familiar devices to Vision Pro, a new and still unrefined paradigm.</p>
<h2 id="heading-what-illuminating-insights-from-the-experts"><strong>(What) Illuminating Insights from the Experts</strong></h2>
<h3 id="heading-andrew-eiche-the-mixed-reality-advocate"><strong>Andrew Eiche: The Mixed Reality Advocate</strong></h3>
<ul>
<li><p><strong>Mixed Reality as an Exploration Realm:</strong> Eiche illuminates the immersive experiences offered by mixed reality, urging developers to harness this potential.</p>
</li>
<li><p><strong>Communication is Key:</strong> He challenges developers to engage actively with forums, asserting that collective problem-solving can expedite development processes.</p>
</li>
</ul>
<h3 id="heading-stefan-the-ux-pathfinder"><strong>Stefan: The UX Pathfinder</strong></h3>
<ul>
<li><p><strong>Navigating the Uncharted:</strong> Stefan shares the struggles of working 'in the dark' with new technologies, emphasizing the importance of streamlined UX in emerging paradigms.</p>
</li>
<li><p><strong>Human-Centric Interaction:</strong> He advocates for prioritizing native human interaction—hand and eye movements—as the cornerstone of exceptional UX.</p>
</li>
</ul>
<h3 id="heading-xander-mccarthy-the-pragmatic-developer"><strong>Xander McCarthy: The Pragmatic Developer</strong></h3>
<ul>
<li><strong>Risk and Resource Management:</strong> McCarthy's approach to app development focuses on prudent risk management and cautions against overcommitment in unfamiliar areas.</li>
</ul>
<h2 id="heading-how-navigating-the-evolution-of-design"><strong>(How) Navigating the Evolution of Design</strong></h2>
<p>As we chart the course of spatial design's evolution, let's embrace the insights from the Unity Apple Vision Pro conference. With a focus on innovation, collaboration, and the savvy use of emergent technologies, we forge ahead.</p>
<h3 id="heading-the-uxui-conundrum"><strong>The UX/UI Conundrum</strong></h3>
<p>Andrew Eiche praises Apple's implementation of the pinch gesture, advising designers to appreciate each interaction's uniqueness and avoid treating it as a mere substitute for grabbing. He also underscores the importance of ergonomic design in 3D spaces, considering the physicality of user interactions.</p>
<h3 id="heading-design-readings-and-resources"><strong>Design Readings and Resources</strong></h3>
<ul>
<li><p><strong>Ergonomics in Interaction:</strong> Explore the physicality of design, where touch, feel, grab, and pinch are not just functions but experiences that require thoughtful ergonomic consideration.</p>
</li>
<li><p><strong>Recommended Reading:</strong> Delve into "The Design of Everyday Things" for an in-depth understanding of usability and human-centered design principles.</p>
</li>
<li><p><strong>Exploring Unity's Tools:</strong> Eiche points to Unity's Interaction Framework and the XR Interaction Toolkit as resources to better grasp interaction design's nuances, despite challenges like non-customizable hover states due to privacy concerns.</p>
</li>
</ul>
<h3 id="heading-the-vision-of-spatial-computing"><strong>The Vision of Spatial Computing</strong></h3>
<p>Andrew traces the lineage of spatial computing, likening Apple's foray into the space as a first-generation device reminiscent of the Apple II—pioneering yet rudimentary. Apple's commitment to spatial design hints at a future where our entire environment becomes an interface, with hand, eye, pinch, and grab tracking heralding a new mainstream wave.</p>
<p>💡 As we continue on this #100DaysOfDesign journey, we're not just observers but active participants in the unfolding story of spatial design. We're here to learn, share, and innovate together. Stay connected, join our Discord community, and let's harness the collective genius to push the boundaries of what's possible in design. 🌐✨</p>
<p>#SpatialDesign #UXInnovation #UnityAppleVisionPro #FutureOfDesign #100DaysOfDesign</p>
]]></content:encoded></item><item><title><![CDATA[Day 27/100: Navigating the Future of UX: Specialist vs Generalist Insights with Jakob Nielsen & Sarah Gibbons]]></title><description><![CDATA[(Why) - Navigating the Evolution of UX
🔥 Today, I had the privilege of attending a fireside chat with Jakob Nielsen, a pioneer in the field of UX design. Jakob's optimistic outlook on the future of UX, coupled with his 40 years of experience, shed l...]]></description><link>https://blog.karanbalaji.com/day-27-navigating-the-future-of-ux-specialist-vs-generalist-insights-with-jakob-nielsen-sarah-gibbons</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-27-navigating-the-future-of-ux-specialist-vs-generalist-insights-with-jakob-nielsen-sarah-gibbons</guid><category><![CDATA[UX]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Design]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Fri, 23 Feb 2024 01:31:42 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1708651667679/65990011-a5e5-4ce2-99a8-b97c099fcff9.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-why-navigating-the-evolution-of-ux"><strong>(Why) - Navigating the Evolution of UX</strong></h2>
<p>🔥 Today, I had the privilege of attending a fireside chat with Jakob Nielsen, a pioneer in the field of UX design. Jakob's optimistic outlook on the future of UX, coupled with his 40 years of experience, shed light on the evolving landscape of our industry. From the emergence of AI tools to the proliferation of software with complex user interfaces, the future of UX is ripe with opportunities and challenges.</p>
<h2 id="heading-what-embracing-specialization-and-generalization"><strong>(What) - Embracing Specialization and Generalization</strong></h2>
<p>🌟 Jakob delved into the debate between being a UX specialist versus a generalist, highlighting the shifting dynamics in the industry. He emphasized the importance of embracing AI as a tool that enhances our capabilities and narrows skill gaps. While specialization was once revered, the trend is now towards a more diversified skill set, with AI acting as a catalyst for innovation.</p>
<h3 id="heading-the-value-of-generalization"><strong>The Value of Generalization:</strong></h3>
<p>Jakob underscored the benefits of being a UX generalist, from reducing communication overhead to enhancing credibility and broadening skill sets. In a world where AI is reshaping the design process, the ability to adapt and diversify one's skill set is paramount.</p>
<h3 id="heading-the-downside-of-specialization"><strong>The Downside of Specialization:</strong></h3>
<p>While specialization has its merits, Jakob cautioned against limiting oneself to a single skill set. As AI continues to evolve, the need for specialists may diminish, making it essential for designers to broaden their horizons and embrace a more holistic approach to UX design.</p>
<h2 id="heading-how-embracing-change-and-innovation"><strong>(How) - Embracing Change and Innovation</strong></h2>
<p>💡 Jakob's insights served as a reminder that UX design is a dynamic field, constantly evolving in response to technological advancements and changing user needs. As we navigate the future of UX, it's essential to embrace change, adapt to new tools and methodologies, and maintain a user-centric mindset.</p>
<p>In this ever-changing landscape, the key to success lies in our ability to embrace both specialization and generalization, leveraging AI as a tool to enhance our skills and deliver exceptional user experiences. Jakob's wisdom has inspired me to continue pushing the boundaries of UX design and embrace the opportunities that lie ahead.</p>
<p>Stay tuned for more insights from my journey into the future of UX! 🌐✨ #UXSpecialist #UXGeneralist #FutureOfUX #JakobNielsen #100DaysOfDesign</p>
]]></content:encoded></item><item><title><![CDATA[Day 26/100:  Insights from Julie Zhuo on Making Data-Driven vs Data-Informed Decisions]]></title><description><![CDATA[(Why) - Understanding Data Dynamics
✍️ Today, I had the privilege of attending an enlightening fireside chat by adplist.org with Julie Zhuo, former VP of Product Design at Facebook and co-founder of Sundial. Julie delved into the nuances between data...]]></description><link>https://blog.karanbalaji.com/day-26-insights-from-julie-zhuo-on-making-data-driven-vs-data-informed-decisions</link><guid isPermaLink="true">https://blog.karanbalaji.com/day-26-insights-from-julie-zhuo-on-making-data-driven-vs-data-informed-decisions</guid><category><![CDATA[UX]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Data Science]]></category><category><![CDATA[Design]]></category><dc:creator><![CDATA[Karan Balaji]]></dc:creator><pubDate>Thu, 15 Feb 2024 22:47:16 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1708036997311/a585eba2-2024-44a5-a2a6-e4e683fd48e1.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2 id="heading-why-understanding-data-dynamics"><strong>(Why) - Understanding Data Dynamics</strong></h2>
<p>✍️ Today, I had the privilege of attending an enlightening fireside chat by <a target="_blank" href="https://app.adplist.org/">adplist.org</a> with Julie Zhuo, former VP of Product Design at Facebook and co-founder of Sundial. Julie delved into the nuances between data-driven and data-informed decisions, emphasizing the importance of context and empathy in design.</p>
<h2 id="heading-what-unpacking-insights"><strong>(What) - Unpacking Insights</strong></h2>
<p>✍️ Julie's insights shed light on the role of data as a guiding force rather than a sole determinant in decision-making. She highlighted the need for a unified approach, where data serves as a tool for understanding and informing design choices, ultimately complementing the empathetic lens of designers.</p>
<h2 id="heading-how-embracing-ai-in-design"><strong>(How) - Embracing AI in Design</strong></h2>
<p>🛠️ Furthermore, Julie shared her vision for the future of design with AI, envisioning it as a transformative tool that streamlines manual tasks and amplifies human creativity. Echoing Steve Jobs's famous description of the computer as "a bicycle for our minds," AI promises to enhance our design processes, enabling us to focus on the impactful aspects of problem-solving.</p>
<p>In this dynamic landscape, where data and AI intersect with human ingenuity, the possibilities for innovation are boundless. Julie's insights serve as a compass, guiding us toward a future where design is not just about pixels and screens but about meaningful impact and problem-solving. Stay tuned for more insights from my journey in #100DaysOfDesign! 🚀 #DataDrivenDesign #AIInnovation #DesignLeadership</p>
]]></content:encoded></item></channel></rss>