TestGuild Automation Testing Podcast

I'm thrilled to have two titans in software testing, Matt Heusser and Michael Larsen, with us today. These veterans bring a wealth of experience and knowledge, and they're here to discuss their latest contribution to the testing community: their new book, "Software Testing Strategies."

In today's episode, we will unpack the inspiration behind "Software Testing Strategies," exploring the trio of testing essentials: skills, strategy, and the nuances of day-to-day operations, including the politics that intertwine with the testing process. The authors will discuss their approach to addressing the complexities of software testing, finding the most effective tests among endless possibilities, and how their book aims to guide you through these challenges.

Matt and Michael will also share critical insights into organizational dynamics, the value of adaptability in the testing realm, and the ethical considerations professionals face in their careers. Plus, we'll touch on the difficult journey of updating outdated systems, navigating the minefield of communication, and why terms like "QA" may need a rethink.

Listeners, you're in for a treat: real-world stories, practical advice, and invaluable expertise, plus a discount code for the book – so stay tuned as we dive into the world of "Software Testing Strategies" on the TestGuild Automation Podcast.

Direct download: tgamichaelAwesomeSoftwareTestingStrategies491.mp3
Category:general -- posted at: 1:45pm EDT

Today's special episode, "RoboCon Recap," is about the insights and highlights from RoboCon 2024. We are privileged to have Tatu Aalto, a renowned maintainer of the Browser library; Frank Van der Kuur, a distinguished Robot Framework trainer from BQA; and Mark Moberts, a well-known figure in the Robot Framework community, with us.

In this episode, our guests will explore the enriching experiences of the conference, from the unveiling of the Market Square to the engaging sessions that sparked valuable discussions. We'll also look at the myriad contributions beyond programming, including documentation, testing, and being an active voice in the community through forums like the Robot Framework Slack channel.

Throughout RoboCon, the spirit of collaboration and knowledge exchange was not just evident; it was the driving force, whether in addressing pitfalls in the framework, swapping tips on finding the right testing library, or discussing Frank's and Tatu's interactive sessions, which went beyond expectations with engaging questions and left a lasting impact on the audience.

Get ready to be immersed in a conversation that not only recaps the energy and learning from RoboCon but also showcases how the Robot Framework community is driving the future of test automation. So plug in as we dive into everything Robot Framework with insights from the experts at the forefront of the automation world. Listen up!

Direct download: tgatatuRoboConRecapTestingNetworkingandBuildingwithRobotFramework490.mp3
Category:general -- posted at: 3:17pm EDT

Today, we're diving deep into the world of automation testing, with a special focus on Netflix's innovative tool, SafeTest. Joining us is Moshe Kolodny, a senior full-stack engineer at Netflix and the person behind this exciting new tool, which bridges the gap between end-to-end and unit testing.

SafeTest, a tool that's been making significant strides in the industry, has garnered widespread community support and impressive traction in a remarkably short time. As we delve into its capabilities, we'll discover how SafeTest seamlessly integrates with popular libraries like Playwright and Jest, offering robust testing capabilities without imposing intrusive dependencies.

Moshe will dig into the philosophy behind SafeTest, underlining the importance of practical, iterative test writing and the pitfalls of over-engineering. We'll explore SafeTest's adaptability, which ensures test consistency across environments with Docker mode, and the bidirectional communication it enables between browser and Node.js, enhancing the overall testing experience.
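To make that concrete, here's a rough sketch of the kind of test SafeTest is described as enabling: a Jest-style component test that hands back a real Playwright page to drive and assert against. The import paths, the render() helper, and the Header component below are assumptions made for illustration, not a verbatim SafeTest example, so check the SafeTest docs for the exact API.

```tsx
// Hypothetical SafeTest-style component test (TypeScript + React).
// Import paths and the render() helper are assumptions for illustration;
// consult the SafeTest documentation for the real API.
import React from 'react';
import { describe, it, expect } from 'safetest/jest';
import { render } from 'safetest/react';

// Hypothetical component under test.
function Header({ user }: { user: string }) {
  return <header>Signed in as {user}</header>;
}

describe('Header', () => {
  it('shows the signed-in user name', async () => {
    // render() mounts the component in a real browser and returns a
    // Playwright Page object, so end-to-end-style checks work in a
    // unit-sized test.
    const { page } = await render(<Header user="Ada" />);
    expect(await page.getByText('Signed in as Ada').isVisible()).toBe(true);
  });
});
```

The Docker mode and browser-to-Node.js communication mentioned above would layer on top of a test like this, keeping runs consistent across machines.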

Our conversation will shed light on the exciting future of SafeTest, from potential additions to the test runner to the introduction of custom reporting features. Moshe will also underscore the tool's commitment to developer experience, exemplified by SafeTest's debugging aids like videos and trace viewers.

It's no secret that SafeTest reflects Netflix's robust approach to quality assurance. It aligns closely with the day-to-day experiences of UI engineers and addresses the intricate challenges of complex user interactions and service integrations.

Stay tuned as we unpack the story of SafeTest's inception, core features, practical applications, and why Moshe believes it's a versatile choice for most testing scenarios.

Direct download: tgaMoshyNetflixSafeTest489.mp3
Category:general -- posted at: 12:23pm EDT

In this episode, we're diving deep into the world of performance engineering with our esteemed guest, Dylan van Iersel, an experienced IT consultant and co-founder of Perfana. We'll explore the intricate relationship between software performance and business outcomes and how tools like Perfana can democratize and simplify the process of performance testing.

Performance is more than just a technical concern; it has direct implications for customer satisfaction and the bottom line. Dylan illuminates the importance of integrating performance testing within the CI/CD pipeline, using Perfana to serve as a quality gate and provide actionable insights through automated analysis and dashboard visualizations.
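As a rough, tool-agnostic illustration of what a quality gate means in practice (this is not Perfana's API, and the metric names and thresholds below are invented), the idea is to compare the metrics from a load-test run against agreed requirements and fail the pipeline when any of them are breached:

```ts
// Toy performance quality gate; metric names and thresholds are invented
// for illustration and this is not Perfana's API.
interface RunMetrics {
  p95LatencyMs: number;
  errorRatePct: number;
}

interface Requirements {
  maxP95LatencyMs: number;
  maxErrorRatePct: number;
}

// Returns the violated requirements; an empty array means the gate passes.
function evaluateGate(metrics: RunMetrics, limits: Requirements): string[] {
  const violations: string[] = [];
  if (metrics.p95LatencyMs > limits.maxP95LatencyMs) {
    violations.push(`p95 latency ${metrics.p95LatencyMs} ms exceeds ${limits.maxP95LatencyMs} ms`);
  }
  if (metrics.errorRatePct > limits.maxErrorRatePct) {
    violations.push(`error rate ${metrics.errorRatePct}% exceeds ${limits.maxErrorRatePct}%`);
  }
  return violations;
}

// In a CI step, a non-zero exit code fails the build when the gate is breached.
const violations = evaluateGate(
  { p95LatencyMs: 420, errorRatePct: 0.2 },
  { maxP95LatencyMs: 500, maxErrorRatePct: 1 },
);
if (violations.length > 0) {
  console.error('Performance quality gate failed:\n' + violations.join('\n'));
  process.exit(1);
}
```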

We'll also discuss the evolution of performance engineering in a cloud-native, containerized landscape, the challenges of scaling performance testing across agile teams, and why a "shift-left" approach of identifying issues early is crucial for today's development processes.

For teams looking to embrace performance testing, Dylan introduces Perfana's starter package and emphasizes the ease of getting up and running, even on a local laptop, as a foundation for more extensive integration into test environments and CI/CD pipelines.

For our listeners interested in cutting-edge developments, we dive into how Perfana innovates with data science and machine learning to enhance anomaly detection and root cause analysis. Plus, we'll get into the nitty-gritty of why observability, while important, shouldn't be your sole resource for performance testing.

Listen in for actionable advice and insights on improving your team's performance engineering efforts, and if you want to learn more about Perfana, try their free trial now!

Direct download: tgaDylanAgileAutomatedAdvancedTheNewAgeofPerformanceTesting488.mp3
Category:general -- posted at: 8:14pm EDT

In this episode, Dave Piacente, a senior developer relations and community manager at Applitools, joins us to talk about redefining test automation.

There is a common set of techniques that seasoned test automation practitioners know to be the pillars of any successful automated testing practice, and these techniques can fit into most (if not all) team contexts.

But some of those techniques have either been left out or are described in ways that are disadvantageous for the industry, simply because we haven't talked about them from the perspective of the fundamentals used to craft them.

By reasoning from first principles, we can unearth a more impactful definition of test automation, one that acts as a compass and helps supercharge everyone's test automation practice, while also showing how to navigate the uncharted waters of new technologies that challenge existing paradigms.

Direct download: tgaDaveAG24AfterTicket487-auphonic.mp3
Category:general -- posted at: 1:48pm EDT
