
Forging The Future with Chris Howard


By: Chris Howard

About this listen

Join Chris Howard, Founder and CEO of Softeq, as he interviews leaders from across the innovation spectrum, including CEOs, CTOs, R&D professionals, and start-up founders: real conversations about the technology and processes of bringing new ideas to market.

© 2022 All rights reserved. "Forging the Future" podcast, content, title, and logo owned by Softeq. Unauthorized use prohibited. Contact: ftf@speakerboxmedia.com. Respect our creativity.
Episodes
  • AI for Good: How NVIDIA’s Thorsten Stremlau Is Transforming Life for People with Disabilities
    Apr 30 2026
    NVIDIA systems architect Thorsten Stremlau sees “AI for good” as a way to restore core human abilities like communication, independence, and participation, especially for people with disabilities. Inspired by his work with Stephen Hawking and Peter Scott-Morgan, he’s helped build affordable, AI-powered tools, from eye-gaze systems to personalized language models, that run on everyday devices like smartphones. He contrasts these non-invasive solutions with brain-computer interfaces from companies like Neuralink, highlighting that while implants may unlock richer interaction in the future, today’s priority is scalable tech that already improves lives. He also explores how AI can support areas beyond ALS, including heart disease, PTSD, and mobility. At its core, his work is about using AI to create a more inclusive world where disability doesn’t mean disconnection.

    🎧 Episode Highlights:
    ● [01:24]: Thorsten’s early work with nonverbal disabled youth
    ● [07:41]: AI restoring communication, autonomy, participation
    ● [18:09]: Working with Stephen Hawking and Peter Scott-Morgan
    ● [29:55]: Low-cost eye-gaze and circular-keyboard communication
    ● [43:12]: Beyond ALS: speech, PTSD, AI prosthetics

    🔑 Key Takeaways:
    ● AI for good is ultimately about restoring human agency, not just boosting efficiency. By focusing on communication, autonomy, and participation, AI systems can give people with disabilities the ability to express themselves, make choices, and engage with the world on their own terms, rather than being defined by their limitations.
    ● The most impactful assistive technologies are built on mainstream, affordable hardware instead of specialized, high-cost rigs. Eye-gaze interfaces, circular keyboards, personalized language models, and speech-decoding systems that run on laptops and smartphones dramatically expand access, making advanced assistive tech viable not just for a few patients in wealthy systems, but for millions of people worldwide.
    ● Non-invasive AI solutions are a powerful bridge to the future of human–computer interaction, even as brain–computer interfaces rapidly advance. By combining clever sensing (eyes, face, voice, heart rate, touch), behavior modeling, and personalized AI, we can already enable richer communication, calmer nervous systems, and more natural movement, laying the groundwork for a world where disability no longer means disconnection from work, creativity, or community.

    👤 Guest Spotlight: Thorsten Stremlau
    Thorsten Stremlau is a Principal Systems Architect at NVIDIA and a technology leader focused on building next-generation computing platforms and customer-centered innovation. Previously with Lenovo and IBM, he holds 30+ patents and is widely recognized as a thought leader in platform security and advanced systems architecture, known for turning complex technical challenges into trusted, market-ready solutions.

    Stay Connected:
    ● https://www.softeq.com
    ● https://www.linkedin.com/in/techris
    ● https://www.linkedin.com/in/thorsten-stremlau-247930
    ● https://www.nvidia.com

    Stay inspired and ahead of the curve by subscribing to Forging the Future. Share your thoughts on this episode with the hashtag #ForgingTheFuture or tag us online!
    52 mins
  • The “SaaS-pocalypse” Is Here: Equipt.ai’s Amanpreet Kaur on How AI Is Rebuilding Software
    Apr 9 2026
    Episode Summary: Amanpreet Kaur, Chief AI and Technology Officer of Equipt.ai, breaks down the so-called “SaaS-pocalypse” and argues that SaaS isn’t dying; it’s evolving into smarter, AI-embedded systems of execution. Amanpreet explains how traditional SaaS created fragmentation and operational friction, and how Equipt.ai is rebuilding the stack by unifying workflows, data, and decision-making into a single AI-driven execution layer. Drawing from her experience in asset-heavy industries like energy, Aman shares how their platform replaces multiple disconnected tools while enabling real-time, guided operations from quote to cash. She also reflects on the early startup journey, from landing their first enterprise customer to navigating investor skepticism and refining their positioning. A key shift is undoubtedly taking place: the future belongs to agentic SaaS platforms that reduce human intervention and turn software into an active driver of business outcomes.

    🎧 Episode Highlights:
    ● [02:09]: SaaS solved infrastructure but created fragmentation
    ● [04:59]: “Dumb SaaS is dying:” AI-powered execution rises
    ● [06:17]: Replacing 10-20 tools with one platform
    ● [10:44]: AI as “steroids” for SaaS, not replacement
    ● [14:38]: First enterprise client before having a product
    ● [23:13]: Embedding AI across the full workflow

    🔑 Key Takeaways:
    ● The “SaaS-pocalypse” isn’t about SaaS disappearing; it’s about a shift from systems of record to systems of execution. Traditional SaaS created fragmented workflows and heavy reliance on human coordination, while AI-enabled platforms are now unifying data and driving real-time decisions directly within operations.
    ● AI’s real value is not replacing software, but embedding intelligence into every layer of it. From automation and optimization to predictive insights, the winning approach is combining machine learning, simple automation, and context-aware AI to reduce manual work and enable more deterministic, reliable outcomes.
    ● Startups that succeed in this shift will focus on solving real operational problems, not just adding AI for hype. Equipt.ai’s journey shows that deep industry understanding, clear pain points, and delivering immediate value to customers matter more than technology trends, especially when building trust and scaling from early enterprise clients.

    👤 Guest Spotlight: Amanpreet Kaur
    Amanpreet Kaur is the co-founder and Chief AI and Technology Officer of Equipt.ai, where she builds AI-powered operational platforms for asset-intensive industries. With a background in energy and industrial technology, she previously led the development and commercialization of digital solutions, including enterprise-scale platforms and digital twin systems. Kaur brings deep domain expertise and a product-first mindset to redefining how businesses move from fragmented SaaS tools to AI-driven systems of execution.

    Stay Connected:
    ● https://www.softeq.com/
    ● https://www.linkedin.com/in/techris/
    ● https://www.linkedin.com/in/amanpreet-hon-doc/
    ● https://www.equipt.ai/

    Stay inspired and ahead of the curve by subscribing to Forging the Future. Share your thoughts on this episode with the hashtag #ForgingTheFuture or tag us online!
    33 mins
  • What If AI Worked More Like the Human Brain? ft. Chris Eliasmith of Applied Brain Research
    Mar 19 2026
    At CES 2026, we sat down with Chris Eliasmith, CTO of Applied Brain Research, to discuss how brain-inspired AI is enabling fast, low-power voice interfaces that run directly on edge devices. Drawing on research modeling the hippocampus, his team developed new neural network architectures that significantly improve efficiency and accuracy for tasks like speech recognition and text-to-speech. These advances allow devices such as AR glasses, robots, and wearables to respond to voice commands in under 300 milliseconds, creating interactions that feel natural and conversational. Eliasmith also explains the tradeoffs between model size, accuracy, and power consumption, and how running AI at the edge can reduce costs and reliance on the cloud. He ultimately envisions a future where complete AI agents run locally on small devices, making technology simpler and more accessible for everyday users.

    🎧 Episode Highlights:
    ● [01:59]: Introducing ultra-low-power voice AI at the edge
    ● [03:27]: Why 300ms latency is critical for natural conversations
    ● [09:06]: Brain-inspired neural networks modeled after the hippocampus
    ● [15:02]: Tiny AI chips for AR glasses, robotics, and wearables
    ● [20:25]: Cutting cloud costs with local speech processing
    ● [27:54]: The future of full AI agents running at the edge

    🔑 Key Takeaways:
    ● By modeling neural networks after how parts of the brain like the hippocampus process time-based information, researchers can build AI systems that achieve higher accuracy with far fewer parameters. This approach allows models to process speech and other signals more efficiently, making advanced AI practical even on small, resource-constrained devices.
    ● For voice interfaces to feel natural, responses must happen within roughly 300 milliseconds, the same timing humans expect in conversation. Designing AI systems that meet this latency requirement changes how models are built and deployed, pushing developers to prioritize real-time performance rather than relying on slower cloud-based processing.
    ● Low-power AI that operates directly on devices reduces reliance on internet connectivity, lowers operational costs, and improves responsiveness. As models become efficient enough to run locally, entire AI agents could operate on wearables, robotics platforms, and AR devices, simplifying technology and making intelligent interfaces accessible to more users.

    👤 Guest Spotlight: Chris Eliasmith
    Chris Eliasmith is the Director of the Centre for Theoretical Neuroscience at the University of Waterloo and holds the Canada Research Chair in Theoretical Neuroscience. He is also the CTO and co-founder of Applied Brain Research, where he works on low-power AI technologies for machine learning, robotics, and edge computing. Eliasmith is the co-inventor of the Neural Engineering Framework, the Nengo software platform, and the Semantic Pointer Architecture, and is the author of How to Build a Brain (Oxford University Press) and Neural Engineering (MIT Press).

    Stay Connected:
    ● https://www.softeq.com/
    ● https://www.linkedin.com/in/techris/
    ● https://www.linkedin.com/in/chris-eliasmith/
    ● https://www.linkedin.com/company/applied-brain-research/
    30 mins