TestBash Brighton 2018


TestBash Brighton, the original and BIGGEST TestBash was back on 15th - 16th March 2018! It was a fun-filled and jam-packed two days with plenty to learn, much to take home and many opportunities to make new tester friends!

We hosted over 300 software testers from across the globe, had 11 amazing talks, lots of awesome 99-second talks and plenty of socials. Plus our first ever UnExpo!

We record all our TestBash talks and make them available on The Dojo. Some are free to watch and others require Pro Membership. Here are all the TestBash Brighton talks. Get stuck in!

Join the discussion about TestBash Brighton over at The Club.

We would like to thank our TestBash Brighton 2018 event sponsors, TestSphere, Scott Logic, TAB and Cambridge Consultants for supporting this software testing conference and the software testing community.

If you would like to attend TestBash or any of our events then please check our latest schedule on our events pages.

Watch all the talks from the event:
Training

Summary

Do you think that testing can be more than writing and executing test cases? That there is more to testing than just checking requirements? Kick your testing up a gear and discover how challenging and invigorating advanced testing can be. Lean Testing teaches a collaborative and investigative test approach that will reignite your passion and improve how you test.

Description

This course focuses on how to make testing more valuable by teaching testers how to test better. This intensive one-day course introduces a variety of proven techniques that help testers complete their work faster and to a higher standard, using critical thinking and advanced test analysis techniques. The course covers the whole test process, from planning to execution, and gives practical examples at each step.

Learning Outcomes

While mastering testing skills will require both time and practice, this workshop will allow you to:

  • Apply heuristic test techniques
  • Develop and critique visual coverage models
  • Recognise and explain the oracles used to identify bugs
  • Write specific and effective test charters to focus exploratory testing
  • Gain awareness of how to write descriptive test reports and see examples of lean testing documentation
  • Conduct feature tours to produce state transition diagrams
  • Use Hendrickson variables to identify variance classes
  • Gain knowledge of combinatorial test design (see the sketch after this list)
  • Identify cognitive biases that can affect your testing
  • Advocate for exploratory testing in your test approach
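
As a rough illustration of the combinatorial test design outcome above (this sketch is not course material; the browser, operating system and locale variables are invented for the example), Python's standard library is enough to enumerate a full combination table, which pairwise techniques can then shrink to a practical subset:

# Minimal illustration of combinatorial test design (not course material).
# Enumerating every combination of a few invented test variables shows how
# quickly the full set grows; pairwise selection keeps a much smaller subset
# that still covers every pair of values.
from itertools import product

browsers = ["Chrome", "Firefox", "Safari"]
operating_systems = ["Windows", "macOS", "Linux"]
locales = ["en-GB", "de-DE", "ja-JP"]

all_combinations = list(product(browsers, operating_systems, locales))
print(f"Full combinatorial set: {len(all_combinations)} test configurations")
for browser, os_name, locale in all_combinations[:3]:
    print(f"  e.g. {browser} on {os_name} with locale {locale}")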

Agenda

All participants apply the concepts introduced through extensive practical activities that allow them to experience everything they learn. Interactive testing exercises encourage group collaboration and get attendees testing real software hands-on.

Prerequisites

A laptop with a USB drive that you have administrator access to.

Aaron Hodder

Aaron Hodder hails from Wellington, New Zealand, where he works for Assurity Consulting to coach testers to develop and deliver new and innovative testing practices to better suit the demands of modern-day software development.

Aaron is a passionate software tester with a particular enthusiasm for visual test modelling and structured exploratory testing techniques. He regularly blogs and tweets about testing and is a co-founder of Wellington Testing Workshops.


Workshops
Morning Sessions

This is not a workshop. This is a talk that all attendees of the workshop day will attend in the main auditorium.

Sitting at your desk, testing another user story, you think of changing the process by adding more automated scripts to free your time for more exploratory testing. You propose end-to-end test automation on top of the small-scale automation you already have, only to be met with the attitude of “that will not work here, we tried it before”. You still believe in it strongly, so where do you go when you’re told not to do it?

In this talk, I will share my story of being the tester whose proposal got turned down because of previous bad experiences. Instead of abandoning the needed end-to-end automation, I made sure I did not compromise my regular work but still made time to work on it. Showing it could be done changed “it won’t work” into “this is really helpful”. Join me to learn how to break down barriers for the things you believe in.

Takeaways:

  • How consistent use of an hour a day enables you to do things you feel strongly about without compromising your other work
  • Importance of believing in yourself when others in the team are less aware of your abilities
  • How showing the results wins over past experiences of the team

Bhagya Perera

Bhagya Perera has been a software test analyst for almost 10 years, after changing career from software development. She is originally from Sri Lanka and currently lives and works in England.

Bhagya is passionate about testing and brings experience of working in multicultural, distributed teams, as well as of experimenting with automation and manual testing. In addition, she has strong interests in leadership, effective communication and mentoring, which broaden her knowledge.


Much of exploratory testing tends to concentrate on the client UI. But the same strategies that work well at the UI level can also be applied to the web services and APIs that power your application. Exploratory testing your web services has multiple benefits. By exploring your APIs you can:

  • Find critical bugs earlier in the development cycle
  • Gain a deeper understanding of how your application or feature works
  • Have greater confidence in your UI level testing, knowing that the services underneath are solid

In this hands-on workshop using both paper and laptop exercises, participants will learn:

  • How to ask the right questions about services and API design
  • Freely available tools for exploring web services
  • Tips and tricks for finding service-level issues

Get beyond the surface of your applications and discover how fun and rewarding API testing can be!

Workshop attendees will learn how exploratory testing can extend beyond the application layer so they can find and potentially fix issues earlier in the development cycle. The workshop will include multiple activities so that attendees can gain confidence in their ability to test web services.
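
For a flavour of what exploring a web service can look like (purely an illustrative sketch, not workshop material; the endpoint and probe values are invented), a few lines of Python are enough to start asking questions of an API: what comes back for a sensible id, an extreme one, or a malformed one, and what do the headers reveal?

# Illustrative exploratory probe of a hypothetical JSON API (invented URL).
# The point is the questions, not pass/fail assertions: how does the service
# respond to reasonable, extreme and malformed inputs, and what do the
# response headers say about the stack underneath?
import requests

BASE_URL = "https://api.example.com/v1/players"  # hypothetical endpoint

for probe in ["42", "-1", "99999999999999", "abc", ""]:
    response = requests.get(f"{BASE_URL}/{probe}", timeout=10)
    print(f"GET /players/{probe!r} -> {response.status_code}")
    print("  content-type:", response.headers.get("Content-Type"))
    print("  server:      ", response.headers.get("Server"))
    print("  body preview:", response.text[:120])

Each surprising answer becomes the next test charter, which is exactly the exploratory loop described above, just a layer below the UI.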

Amber Race

Amber Race is a Senior SDET at Big Fish Games.

After majoring in Asian Studies, teaching in Japan, and travelling the world, she stumbled into software testing and has been loving it ever since. She has over 15 years of testing experience at Big Fish and Microsoft, doing everything from manual application testing to tools development to writing automation frameworks for web services.

Amber has worked on a wide variety of products and written automation in C#, C++, Python, and Java.

She currently specialises in test automation and performance testing for high volume back-end services supporting iOS and Android games.


So many interviews are about selling yourself. It feels disingenuous, misleading. The candidate is saying what the interviewer wants to hear. The candidate doesn't get a good idea of what they're expected to know on day one. They want to know what it's like to work with those people on that team every day. The interviewer's touting the company line and asking questions about a resume stacked with buzzwords. They want to parse out how much the candidate witnessed vs. contributed to a project. Interviewing is typically a one-way street. But it needs to go both ways.

Pair testing in interviews gets rid of this artifice. Martin and Elizabeth join the interviewee in the room. We catch the candidate up with our plan: other people are going to talk to them about their resume; we want to explore a website together to see a bit more about how they think and what they can find. Elizabeth helps navigate the bug-riddled website. “What are you noticing about that? Talk me through what you’re thinking.” Martin asks the meta-questions. “How much more time would you need to test this? What kind of tools or automation would help make you more productive?”

Interviewers need to treat interviews like exploratory test sessions. They can’t use the same script for every person, just like testers can’t use the same script for every test. Interviewers gather evidence and uncover new information about how the candidate thinks. Candidates stop trying to sell themselves and demonstrate their skills instead. Martin and Elizabeth are there for the triumphs and the struggles. They celebrate when a candidate discovers and reports their first bug. They scream internally when a “Did you see that?” isn’t a heuristic for a candidate to dig deeper. They observe the behaviors, and change their actions as a result. Hopefully the candidate does too. By the end of the session, everyone has a better understanding of whether or not this role is a good fit.

We’re going to foster an environment where everyone can be their genuine, authentic selves. We’re going to have attendees practice how to:

  • Describe a decision or a train of thought
  • Encourage a line of thought without giving too much away
  • Redirect someone onto a different point of focus
  • Reevaluate expectations
  • Deliver critical feedback concisely

Martin Hynie

With over fifteen years of specialization in software testing and development, Martin Hynie’s attention has gradually focused towards embracing uncertainty, and redefining testing as a critical research activity. The greatest gains in quality can be found when we emphasize communication, team development, business alignment and organizational learning.

A self-confessed conference junkie, Martin travels the world incorporating ideas introduced by various sources of inspiration (including Cynefin, complexity theory, context-driven testing, the Satir Model, Pragmatic Marketing, trading zones, agile principles, and progressive movement training) to help teams iteratively learn, to embrace failures as opportunities and to simply enjoy working together.


Elizabeth Zagroba

Elizabeth tests software at Mendix in Rotterdam. In the course of her career, she’s tested web apps, mobile apps, APIs, and content management systems. Her article about mind maps became one of the most viewed on the Ministry of Testing Dojo in 2017. She spoke about moonwalking at TestBash New York in 2015, succeeding as an introvert at TestBash Philadelphia in 2016, and how less can be more with Diana Wendruff at TestBash Brighton earlier this year.


We know that application security is important. We have to protect our customers' data and our employers' data while keeping our systems up and running. But do we have the skills and knowledge to meet that challenge?

During this workshop, we will begin to explore some of the concepts, skills, and techniques of security testing by working with a vulnerable web application. Through practical activities and hands-on learning, we will discover the key security issues that affect web applications today.

Testers will learn skills to identify software vulnerabilities and understand common threats and risks that occur in web applications. We will also examine some of the tools and utilities that can enhance and extend security testing efforts. Let's look at the essential steps to build and execute your own security testing strategies. Let's examine how learning and mentoring can aid in the development of those strategies. You can and should build up your own skills with integrated security testing. This will ensure the ongoing relevance of your role in a security context, and the success of your organisation.

Building upon personal experience of integrating security testing into an existing organisation, incorporating DevOps, continuous delivery and integration, this workshop will highlight and discuss the reflections of learning from hackers, recent breaches and the socio-economic, political and technical impact upon software development organisations.

Attendees will take away a set of advice and techniques to incorporate and enable security testing into their day to day work, answering some of the questions that may arise around scope, skills, tools, models and learning.

Technical requirements: This is a practical workshop, so all attendees will require a laptop, and the ability to install and run the application under test, as well as some open source tools that will be useful during the session. Installation instructions and a tool list will be sent before the workshop, and pre-installation is highly recommended for a smooth workshop experience.

Prior experience in security testing web applications is not necessary; however, attendees will need to be comfortable testing web applications and using modern web browsers (e.g. Firefox, Chrome, Safari).

Takeaways:

  • An understanding of key security risks, threats and vulnerabilities
  • Hands-on practice of security testing skills in a safe space
  • Development of a security mindset
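
As one small, generic example of the kind of check a tester can run without specialist tooling (this is not the workshop's material; the target URL is hypothetical), inspecting a web application's response headers for common protections is a gentle way into security testing:

# Generic illustration: report which common security response headers a web
# application sends. A missing header is not automatically a vulnerability,
# but it is a useful prompt for a conversation with the team.
import requests

TARGET = "https://staging.example.com"  # hypothetical application under test

EXPECTED_HEADERS = [
    "Content-Security-Policy",
    "Strict-Transport-Security",
    "X-Content-Type-Options",
    "X-Frame-Options",
]

response = requests.get(TARGET, timeout=10)
for header in EXPECTED_HEADERS:
    value = response.headers.get(header)
    print(f"{header}: {'present -> ' + value if value else 'MISSING'}")
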
    Dan Billing

    Dan Billing has been a tester for 17 years, working within a diverse range of development organisations, mostly in the south-west of England. He is now running his own consultancy, The Test Doctor, based near Brighton in Sussex. His passions in testing include mentoring, supporting and training team members to develop their security skills.

    Dan's love of testing drives him to be an active member of the testing community, helping to organise local testing events and learning. He is also a co-host of the podcast Screen Testing, alongside Neil Studd.


    Laptops or tablets required for this tutorial

    Many organizations find that system level test automation does not work as well as they thought it would - the magic didn’t happen for them. In many cases, these failures are due to generic technical reasons, which can be fixed relatively easily. These test automation patterns are common to automation efforts at any level with whatever tools you are using. We focus on often-neglected technical issues (i.e. not primarily management issues) and the patterns that help solve them. We look at issues such as BRITTLE SCRIPTS, INADEQUATE DOCUMENTATION, and UNFOCUSED AUTOMATION and discuss patterns such as TESTWARE ARCHITECTURE, DOCUMENT THE TESTWARE, AUTOMATE WHAT’S NEEDED, INDEPENDENT TEST CASES and TOOL INDEPENDENCE, as well as other issues and patterns that delegates want to investigate. Learn how to navigate efficiently through the patterns documented on the Test Automation Patterns Wiki, and develop a better understanding of technical test automation challenges and solutions.

    Bring your laptop or internet-enabled tablet to gain access to the wiki during the tutorial.

    The tutorial uses a mix of lecture, exercises and group discussion to explore the wiki and find solutions for common issues. A few selected patterns are covered in depth, and there is time for delegates to address the issues and problems they most want to learn more about.

    Outline of tutorial:

    • Introduction & delegate issues
    • Test Automation Issues and Patterns
    • Using the wiki
    • Patterns covered:
      • TESTWARE ARCHITECTURE
      • DATA-DRIVEN vs KEYWORD-DRIVEN
      • DOCUMENT THE TESTWARE
      • AUTOMATE WHAT’S NEEDED
      • INDEPENDENT TEST CASES
      • TOOL INDEPENDENCE
      • COMPARISON DESIGN
      • EXPECTED FAIL STATUS
    • Exploring of issues and patterns most relevant to delegates
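
The patterns themselves are documented on the wiki; purely to illustrate the spirit of two of them, INDEPENDENT TEST CASES and DATA-DRIVEN testing (this example is not taken from the wiki, and the ShoppingCart class is an invented stand-in for a real system), each test below creates its own state and reads its inputs from a small data table:

# Illustration only (not wiki material): a fixture gives each test a fresh
# cart so tests can run in any order (INDEPENDENT TEST CASES), and one
# parametrised test covers several inputs from a data table (DATA-DRIVEN).
import pytest


class ShoppingCart:  # invented stand-in for the system under test
    def __init__(self):
        self.items = {}

    def add(self, name, price, quantity=1):
        self.items[name] = self.items.get(name, 0) + quantity * price

    def total(self):
        return sum(self.items.values())


@pytest.fixture
def cart():
    return ShoppingCart()  # fresh state per test, no shared setup


@pytest.mark.parametrize("price, quantity, expected_total", [
    (2.50, 1, 2.50),
    (2.50, 4, 10.00),
    (0.00, 3, 0.00),
])
def test_single_item_totals(cart, price, quantity, expected_total):
    cart.add("tea", price, quantity)
    assert cart.total() == pytest.approx(expected_total)

Run it with pytest; because no test depends on another test's leftovers, any subset can run in any order, which is the property the patterns are after.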

    Dorothy Graham

    Dorothy Graham has been in software testing for over 40 years, and is co-author of 4 books: Software Inspection, Software Test Automation, Foundations of Software Testing and Experiences of Test Automation. She is currently working on a wiki on Test Automation Patterns with Seretta Gamba.

    Dot is a popular speaker at international conferences world-wide. She has been on the boards of many conferences and publications in software testing, and was programme chair for EuroSTAR in 1993 (the first) and 2009. She was a founder member of the ISEB Software Testing Board and was a member of the working party that developed the ISTQB Foundation Syllabus. She founded Grove Consultants and provided training and consultancy in software testing for many years, returning to being an independent consultant in 2008.

    She was awarded the European Excellence Award in Software Testing in 1999 and the first ISTQB Excellence Award in 2012.


    Seretta Gamba

    Seretta Gamba has forty years of experience in software development. As test manager at ISS Software GmbH, she was charged in 2001 with implementing test automation. After studying the then current strategies, she developed a kind of keyword-driven testing and a framework to support it. In 2009, the framework was extended to support manual testing. Speaking about this at EuroSTAR, Seretta got the attention of Dorothy Graham who subsequently invited her to contribute a chapter to the book Experiences of Test Automation. After reading the entire book, Seretta noticed recurring patterns in solving automation problems and began to write a book on test automation patterns. She was soon joined by Dorothy and together they developed the Test Automation Patterns wiki.

    Together with Dorothy or alone, Seretta has given tutorials and talks about test automation, and especially Test Automation Patterns, at major conferences (STAR East & West, EuroSTAR, etc.).


    Afternoon Sessions

    As testers, it's our job not only to learn about our products and projects but also to share that information with others to allow them to make informed decisions. However, communication and reporting techniques are skills that testers often forget to practise and improve. Sometimes a tester needs to ask themselves what the best way to communicate with a team is, what style of note-taking works best for them, and how to report information clearly and in a timely manner.

    Dan and Mark’s interactive session offers exercises, examples and discussion points on how to:

    • Describe different forms of communication and why communication is important
    • Discuss the challenges surrounding communication and how to overcome them
    • Apply communication techniques to support your testing
    • Contrast different note taking styles and determine the right one for you
    • Explain your testing activities and what you have learnt during testing

    By the end of the session you will be able to communicate successfully and record/report your testing in a way that is clear, concise and effective for others to act upon.

    Dan Ashby
    Dan is a software tester, and he likes porridge! (And whisky!)
    Mark Winteringham

    I am a tester, coach, mentor, teacher and international speaker, presenting workshops and talks on technical testing techniques. I've worked on award-winning projects across a wide variety of technology sectors, ranging from broadcast and digital to finance and the public sector, working with various web, mobile and desktop technologies.

    I'm an expert in technical testing and test automation and a passionate advocate of risk-based automation and automation in testing practices, which I regularly blog about at mwtestconsultancy.co.uk. I'm also the co-founder of the Software Testing Clinic in London, a regular workshop for new and junior testers to receive free mentoring and lessons in software testing. I also have a keen interest in various technologies, regularly developing new apps and Internet of Things devices. You can get in touch with me on Twitter: @2bittester


    Agile working and cross-functional teams have the ability to silo organisations into teams, programmes and functions. This leads to duplication of work, a reduction in knowledge sharing and, worse, cuts people off from their support network. At a time when organisations are scaling, structures are flattening and workforces are increasingly fluid, supporting and connecting people is more important than ever. This is where communities of practice come in.

    In this workshop, Emily will take you through the value of communities of practice, what needs to be in place and the steps towards creating successful communities. This workshop is for people wanting to start communities of practice as well as those that already have communities and want to increase their value for members and their organisations.

    Emily Webber

    Emily Webber has been working with Agile teams and organisations for a number of years. She has a breadth of experience of delivery and agile transformation in both the private and public sectors.

    She was the Head of Agile Delivery at the Government Digital Service (GDS), where, amongst other things, she built, developed and led an amazing team of ~40 Agile delivery professionals. While doing this, she created the model used by many organisations for developing communities of practice. She has taken this model and applied it to organisations including the Department for Work and Pensions (DWP), the Ministry of Justice (MoJ), the Department for Environment, Food and Rural Affairs (Defra) and Co-op Digital, as well as capturing it in her book 'Building Successful Communities of Practice'.

    She is always seeking opportunities to give back to the Agile community. She co-founded Agile on the Bench, a meet-up in London, and a one-day Agile conference, and is often found speaking at Agile and Lean conferences and meet-ups.

    She is passionate about teams, communities, organisational learning and skills development. She blogs at emilywebber.co.uk and has a weak spot for vintage scooters.


    Good stories last: they release emotions, they have their own life, they make you proud and they can help you inform and move others. This is why we embarked on our quest: the quest for the Ultimate Test Story! We set forth to scour the earth in search of testers. Testers with interesting experiences and crazy stories to tell, but without the words to bring them to life. Our weapon of choice: TestSphere, a deck of one hundred cards that inspires and supports these testers to craft, temper and shape their raw experiences into strong, red-hot stories of power.

    In our workshop we will teach you how to use these cards by giving you different assignments in small groups. Most importantly, this means you are going to tell test stories. We are convinced that every tester has had interesting experiences that deserve to be told and shared. After telling each other these stories, you can start to discuss the content of the story and the performance of the storyteller:

    Do you have similar experiences?

    Would you do something differently?

    Do you have a challenging question?

    How can the story be told in a more captivating way?

    After you have finished the discussion, you can reward each other with a small gift in the form of a sticker, to show your appreciation for the shared story and insights.

    Content is vital, but it is also important to look at your performance: “how to improve your storytelling”. A good story is priceless: It can help you convince your manager or drive home your point in a coaching session. In our workshop it is crucial that you give each other feedback on how to tell a good story.

    By the end of the workshop, you will have done many interesting activities: you'll have heard and told lots of stories and received feedback on how to narrate a great story. Maybe you were even rewarded for your positive participation. Challenge yourself and your test colleagues by telling interesting test stories, and who knows, maybe you can help us finish our quest by performing for us: The Ultimate Test Story.

    Stay awhile and listen. For the best place by the fire is kept for... the storyteller.

    Attendees will have:

    • Explained multiple stories about testing and defended decisions under scrutiny of their peers;
    • Taken stories from other peers under rigorous questioning and used that to teach as well as learn;
    • Heard multiple heuristics, techniques, patterns, quality aspects and feelings explained;
    • Acquired feedback on how to tell stories;
    • Used the Test Sphere card deck.

    Beren Van Daele

    I'm a software tester from Belgium who shapes teams and testers to improve their work and their understanding of testing. I'm an organiser of BREWT, the Belgian peer conference, and of a testing meetup in my hometown of Ghent, and I speak at multiple European conferences. Together with the Ministry of Testing, I created TestSphere, a card game that gets people thinking and talking about testing.

    My dream is to become a test coach for people that nobody believes in anymore, or who no longer believe in themselves; people who have motivation and will, but no luck. I want to tap into that potential and give them the opportunity to become kick-ass testers.


    Ard Kramer

    I am a software tester from the Netherlands and I have been working for Alten Nederland since 2008. I call myself a Qualisopher, which stands for someone "who loves truth and wisdom and at the same time is decisive to improve man and his environment". This means I am interested in the world around us, to see what I can learn and apply in software testing. That is one of the reasons why I tell stories in books and at (test) conferences such as EuroSTAR, Expo:QA, Belgium Testing Days, CAST and Testnet. My dream is to participate, as a good qualisopher, in all kinds of projects, such as sports, culture or software testing: projects which add value to our community. I want to inspire other people through cooperation, fun and empathy, and hopefully bring light into someone's life.


    Security testing can seem like a daunting task that is best left to external contractors. In the very best-case scenario, we hope that the expensive penetration test does not turn up any security issues, as these would be likely to delay the release.

    This workshop is aimed at mobile application software testers, to help them understand exactly what a penetration tester will do and why, for most vulnerabilities, a security expert is not needed. In this half-day workshop, we will look at how security testing can be carried out on Android and iOS applications, oftentimes MORE successfully, by testers who know the product inside and out.

    Takeaways: How to perform basic security testing of mobile (Android and iOS) applications using tools and techniques employed by external penetration testers, and how to integrate security testing into the in-house testing process. Although this course will not be a replacement for a full pen test, it is hoped that, by performing security tests while the application is being developed, external consultants (when needed) will require less time for a test and find fewer high-risk vulnerabilities.

    Jahmel Harris

    Jahmel (Jay) is a security researcher and hacker. He co-founded Digital Interruption last year, a security consultancy which helps secure organisations with a mix of penetration testing and embedding security into application development pipelines. With a background in not only security testing but also software development, Jahmel is able to advise engineers on balancing security with functionality.

    Jahmel has a particular interest in mobile application security, reverse engineering and radio and has presented talks and workshops at home in the UK and abroad. He also runs Manchester Grey Hats – a group aiming to bring hackers together to share knowledge and skills.


    This is not a workshop. This is a talk that all attendees of the workshop day will attend in the main auditorium.

    Have you ever needed to share something difficult with a colleague? And agonised about how they would react? Or been shocked by how they did react?

    This talk is the amusing and awkward tale of what it's like to share something deeply important for the first time. How can you prepare beforehand? How can you respond if you are the colleague? What do you do if one of you puts their foot in their mouth?

    Anusha Nirmalananthan

    Anusha has worked in the tech industry for over 10 years. Her most recent roles include Head of Growth, Data & AI at JustEat and Head of Product, European Product Development at eBay in London. She likes useful and usable things, and trained with experts in UX from IDEO and Adaptive Path while working for an eReader startup in San Francisco early in her career. She has contributed to a variety of diversity and inclusion programmes in the corporate and voluntary sector. She is currently training part-time in psychotherapy and counselling. She has a Computer Science degree from Queens’ College, Cambridge University.


    Conference

    So, you've heard of expos and unconferences; well, this is an UnExpo! It's an attendee-driven expo.

    Last year we received some feedback from attendees that they would like more to do during the breaks of TestBash. Expos are what some conferences use to fill any break-time gaps, but expos have never fit in with our ‘user-experience’ goals for TestBash. However, we are driven by what our community wants, love experimenting with new ideas, and so The UnExpo idea was born!

    The UnExpo will run during the morning and afternoon breaks and during the lunch break of TestBash. During this time, attendees will be given the chance to set up a "stand" and share their innovations, testing practices, new tools, testing problems, or thoughts on any topic that will initiate interesting discussions. Other attendees, armed with post-its, pens and awesome ideas, will then be able to visit stands of interest to discuss all things testing!

    Benefits of The UnExpo

    • An opportunity for attendees interested in the same subject to engage in conversation. Who knows what that may lead to in the future… potential collaborations, job offers, or maybe even a new tester friend!

    • Gain impartial and insightful feedback from peers when running a stand. Critique, both positive and negative, can help you improve ideas and solve problems.

    • A quick and accessible way to see and hear all about what the testing community is up to. What tools are others using? What problems are they encountering? What solutions are they creating? All packaged up in one room, ready and waiting for you.

    Richard Bradshaw
    Richard Bradshaw is an experienced tester, consultant and generally a friendly guy. He shares his passion for testing through consulting, training and giving presentations on a variety of topics related to testing. He is a fan of automation that supports testing. With over 10 years of testing experience, he has a lot of insights into the world of testing and software development. Richard is a very active member of the testing community, and is currently the FriendlyBoss at The Ministry of Testing. Richard blogs at thefriendlytester.co.uk and tweets as @FriendlyTester. He is also the creator of the YouTube channel, Whiteboard Testing.

    DevOps is taking over the tech world. Now, automated programmable infrastructure is reaching widespread adoption. We’ve also embraced test automation. But there’s a crucial part we’re missing: testing the programmable infrastructure code itself. After all, it’s code too, right?

    Even worse, microservices and other exotic architectures are making deployments ever more complex. The more complex things get, the more bugs we will find. We can’t afford to ignore it for long.

    I worked on a project where we absolutely had to test infrastructure - because our application deployed, configured, and maintained cloud resources. This talk will take you through some of the strategies we developed, some of the problems we encountered, and some thoughts about where the industry might be headed from here.

    Takeaways:

    • Why infrastructure also needs testing
    • Why testers need to care about infrastructure and why it can't just be left to ops
    • What kind of tools are available to test it
    • What approaches you can take to test it
    • What downsides are there to it, and what problems we face now
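
The strategies in the talk are the speaker's own; purely as a generic illustration of what "testing the infrastructure code itself" can mean (this is not the approach from the talk, and the file name, schema and policy rules are invented), even a declarative definition can be unit-tested like any other code before anything is deployed:

# Generic illustration (not the speaker's approach): treat a declarative
# infrastructure definition as data and assert policies against it before
# deployment. Assumes a hypothetical infrastructure.yaml containing a
# "security_groups" list; run the checks with pytest.
import yaml  # PyYAML


def load_security_groups(path="infrastructure.yaml"):
    with open(path) as handle:
        return yaml.safe_load(handle).get("security_groups", [])


def test_no_security_group_is_open_to_the_world():
    for group in load_security_groups():
        for rule in group.get("ingress", []):
            assert rule.get("cidr") != "0.0.0.0/0", (
                f"{group['name']} exposes port {rule.get('port')} to the internet"
            )


def test_every_security_group_has_an_owner_tag():
    for group in load_security_groups():
        assert "owner" in group.get("tags", {}), f"{group['name']} has no owner tag"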

    Matt Long

    Matt is a QA engineer for eBay, based in London.

    He has worked as a QA consultant for many years, and built test automation frameworks in half a dozen languages. His particular areas of expertise involve cloud infrastructure, serverless architectures, API and web testing. He has presented talks at QCon, muCon, Agile Cambridge, and many smaller meetups across London.

    In his spare time, Matt is interested in indiepop music, pretentious arty videogames and relentlessly quoting The Simpsons. He also builds and maintains a machine learning foosball bot.


    In this session, Aaron will talk about a recent experience where a test team of business users needed to be coordinated to test a large, complex product in a way that was reportable, legible, and traceable. We didn't want to constrain the business users within the bounds of prescriptive test cases, but we needed to estimate, track, and report on the testing that was done daily.

    Using a combination of kanban, visual test coverage modelling, and managing testing based on sessions, we rose to the challenge and performed testing in a way that was visible and reportable while giving the testers enough freedom to explore and investigate.

    Takeaways:

    • Exposure to a framework for managing exploratory testing using a combination of kanban, SBTM and visual modelling.
    • An understanding of how exploratory testing can be auditable and traceable.

    Aaron Hodder

    Aaron Hodder hails from Wellington, New Zealand, where he works for Assurity Consulting to coach testers to develop and deliver new and innovative testing practices to better suit the demands of modern-day software development.

    Aaron is a passionate software tester with a particular enthusiasm for visual test modelling and structured exploratory testing techniques. He regularly blogs and tweets about testing and is a co-founder of Wellington Testing Workshops.


    Agile working and cross-functional teams have the ability to silo organisations into teams, programmes and functions. This leads to duplication of work, a reduction in knowledge sharing and, worse, cuts people off from their support network. At a time when organisations are scaling, structures are flattening and workforces are increasingly fluid, supporting and connecting people is more important than ever. This is where communities of practice come in.

    Communities of practice have many valuable benefits for both individuals and organisations. In this session, Emily will draw on her experiences of developing communities of practice at the Government Digital Service, in government departments and in other organisations, as well as on case studies from her ongoing research into this area, to show you why communities of practice are a vital piece of your agile organisation and what role they can play.

    Emily Webber

    Emily Webber has been working with Agile teams and organisations for a number of years. She has a breadth of experience of delivery and agile transformation in both the private and public sectors.

    She was the Head of Agile Delivery at the Government Digital Service (GDS), where, amongst other things, she built, developed and led an amazing team of ~40 Agile delivery professionals. While doing this, she created the model used by many organisations for developing communities of practice. She has taken this model and applied it to organisations including the Department for Work and Pensions (DWP), the Ministry of Justice (MoJ), the Department for Environment, Food and Rural Affairs (Defra) and Co-op Digital, as well as capturing it in her book 'Building Successful Communities of Practice'.

    She is always seeking opportunities to give back to the Agile community. She co-founded Agile on the Bench, a meet-up in London, and a one-day Agile conference, and is often found speaking at Agile and Lean conferences and meet-ups.

    She is passionate about teams, communities, organisational learning and skills development. She blogs at emilywebber.co.uk and has a weak spot for vintage scooters.


    While software testing has always been changing and evolving, the changes many of us are seeing recently go well beyond the scope of changes we’ve seen in the last two (or more) decades. Independent test teams are fading in favour of testers embedded into the development team. Large portions of automation are now owned by developers. Data analysis and monitoring are taking on a prevalent role. Technical skills well beyond writing code are becoming critical knowledge. The scope and breadth of the test role is requiring more and more expertise and depth of knowledge.

    Alan Page has led (and is leading) teams through the transformation to modern, advanced testing. In a keynote filled with experiences, anecdotes and practical examples, as well as warnings about potential traps, he shares everything he knows (or at least everything he can fit into this session) about modern testing in 2018 and what it means for every software tester.

    Takeaways:

  • Knowledge of technical skills beyond automation that are valuable to testers
  • Ideas and examples of "new" tasks and approaches that testers can use to provide value to their team
  • Examples of how to lead changes in testing on your own teams

    Alan Page

    Alan Page has been a software tester for over 25 years, and is currently the Director of Quality for Services at Unity Technologies. Prior to Unity, Alan spent 22 years at Microsoft working on projects spanning the company, including a two-year position as Microsoft's Director of Test Excellence.

    Alan was the lead author of the book "How We Test Software at Microsoft", contributed chapters for "Beautiful Testing", and "Experiences of Test Automation: Case Studies of Software Test Automation". His latest ebook (which may or may not be updated soon) is a collection of essays on test automation called "The A Word: Under the Covers of Test Automation", and is available on leanpub.

    Alan also writes on his blog (http://angryweasel.com/blog), podcasts (http://angryweasel.com/ABTesting), and shares shorter thoughts on twitter (@alanpage).


    A colleague attempts to compliment you, but insults you in the process. You ask someone a yes-or-no question, and they explain the system to you for 45 minutes. You open a ticket to find a pasted email chain, unedited. At times, we can all make conversations too lengthy, or enter them without adequate forethought. The result can be that we come across as fake or insincere. Why does any of this need to be communicated?

    In this talk, we want to impart that experience to others. We'll present techniques for how people can quickly come to a common understanding. We'll share examples of how a short interaction doesn't have to be superficial. You'll find out what you can say to a confidant that you might not want to say to an acquaintance. We'll engage you in learning about self-awareness, social norms and judgement. You'll know how to be succinct quickly, so less can be more.

    You’ll understand:

    • Which conversations to have
    • When to be a listener
    • When to think things through

    Elizabeth Zagroba

    Elizabeth tests software at Mendix in Rotterdam. In the course of her career, she’s tested web apps, mobile apps, APIs, and content management systems. Her article about mind maps became one of the most viewed on the Ministry of Testing Dojo in 2017. She spoke about moonwalking at TestBash New York in 2015, succeeding as an introvert at TestBash Philadelphia in 2016, and how less can be more with Diana Wendruff at TestBash Brighton earlier this year.


    Diana Wendruff

    Diana is the lead tester at Loop Returns. She's tested the complex process of authoring admin portals, third-party integrations, and how websites work for the end user. In her free time she likes to bike, hike, and explore in nature.


    Testing happens in computer labs, and at desks, and on mobile phones, and anywhere else with computers. Testing, like the broader field of software development, is a discipline grown by practitioners. But practice isn't the only place ideas develop. The lessons we learn while testing filter their way into training courses and academic instruction, and likewise, researchers use those courses to pass ideas down to us in the field.

    This talk will discuss the current state of academia and how it handles testing: what's missing, what's good, and what's bad. It will ask "should we care?" and talk about how we can make things better.

    Takeaways: Participants will leave with a better understanding of what universities have to say about testing, and for many, a better understanding of how academic research works. They'll be thinking about how their ideas spread, and how we can contribute to a smarter world for testing.

    Geoff Loken

    Geoff Loken has been in QA (or testing, if you prefer) for approximately ten years, and is currently serving as Quality Assurance Coordinator at Athabasca University. He holds a Master of Arts degree in History from the University of Regina, and is in the final stages of a Master of Science in Computing and Information Systems from Athabasca University. He lives happily in northern Alberta, but escapes once or twice a year to travel or talk at conferences. Find him on twitter @geoffloken


    Learning something new is hard! Learning which new testing-related subject to focus on is super difficult!

    I've never been a natural learner; I have a very poor educational background on paper, but my current thirst for knowledge proves that anyone can "learn to learn". I'm a relatively new tester, and beginning any new career means that the learning curve is going to be extremely steep!

    In this talk, I would like to take you on a journey through my own experiences and walk you through some of the failures and successes that I've had on my path to becoming the tester I am today, along with my own vision of where I'd like to see myself.

    Takeaways: I want people to know that failure is not a bad thing; it's how we learn and grow as people and as testers. Learning is a very personal thing, and there are a number of different methods that I use, or have used in the past, in order to find the right fit for me.

    I want people to be more vocal about the things that they learn and not to be afraid to share these experiences with others - there will always be someone that finds value in what you share.

    Above all, I want people to have fun and not get bogged down with the pressure of trying to learn everything there is to learn about anything in one day. We need to take a step back sometimes and celebrate the things that we have learnt so far.

    Danny Dainton
    Danny is a former British Army Infantry Soldier who has served in places such as Iraq and Afghanistan but now finds himself completely in love with the software testing craft. He's relatively new to testing but enjoys the challenge of learning new things every day! He's currently a Senior Tester at the awesome NewVoiceMedia, based down in Basingstoke, but is very much a remote worker living in Bristol. You can reach him on Twitter @dannydainton and he also blogs at dannydainton.com

    We all test in different ways, and sometimes it can be hard to explain the thought processes behind how we test. What leads us to try certain things, and how do we draw conclusions? Surely there is more going on here than intuition and luck. After working in games testing for almost a decade, I will draw from my personal experience to explain how games testers develop advanced logical reasoning skills. Using practical examples that will make you think, I will demonstrate logical patterns, rules and concepts that can help all of us gain a deeper understanding of what is actually happening in our minds when we test.

    Takeaways: See how testing looks and feels from the perspective of a games tester, and hear about some of the challenges games testers face.

    Learn about the differences between Deductive, Inductive and Abductive reasoning along with the theory of Falsificationism.

    Identify some of the biases we encounter when using personal observations and how logical reasoning can be applied when testing.

    Rosie Hamilton
    Hello, I'm Rosie. I've been testing since 2005 and survived 9 years in the UK Games Industry, working at places like Xbox Cert, 2K Games & Blizzard Entertainment. In 2014 I left games testing behind and stepped into the world of testing business software. Around this time I started writing a testing blog called Mega Ultra Super Happy Software Testing Fun Time to keep track of all the new things I was learning. I've spoken at and been involved with Newcastle Upon Tyne Agile Testing, Agile North East, South West Test and Leeds Testing Atelier. I am also a host for #TuesdayNightTesting gatherings. My hobbies include violin and yoga.

    Nobody really likes change; it's human nature. Testers have a special relationship with changing tools and techniques: they change, and we tend to flounder a little and end up very nervous about our place in the new world. Continuous delivery is one such circumstance, and I see and speak to many testers really struggling. However, with a significant shift in outlook and a chunk of personal development, testers can excel in environments such as these. It's time to start getting out in front of a changing world, rather than always battling to catch up.

    I want to share my experience of adding value as a tester in a continuous delivery environment: what new technologies and techniques I've learned, using your production environment as an oracle, advocating for testability and, most crucially, not overestimating what our testing can achieve. Testing is not the only form of feedback, and it's time to let go of some of the aspects of testing we cling to.

    Continuous delivery adds richness and variety to our role as testers. To me, it is a facilitator for the autonomy and respect that testers have craved for a long time, so let’s get involved...

    Takeaways:

    • My learning and experience of working in a low-batch-size, high-flow environment, and the effects that it had on me as a tester.
    • Aspects of testing which hinder one's effectiveness in this environment, and how to let go of them.
    • A few practical ways to add value, based on skills that I gathered along the way...

    Ash Winter
    Ash Winter is a learning tester, conference speaker and unashamed internal critic, with an eye for an untested assumption or claim. He is a veteran of various roles encompassing testing, performance engineering and automation, as a team member delivering mobile apps and web services and as a leader of teams and change. He helps teams think about testing problems, asking questions and coaching when invited.