AI Isn’t Phasing Out Software Development Roles. But It Is Changing Development Workflows.
We’re living in an era where AI is progressively transforming the nature of work across many industries. This is particularly true for STEM fields, with software development and computer science roles seeing some of the most significant impacts. AI adoption is not only altering day-to-day operations but also reshaping the broader management of computer engineering workflows. For example, a recent GitHub and Wakefield Research survey of 500 U.S.-based developers at enterprise companies found that 92% of respondents already use AI coding tools, either in or outside of work. The majority see these tools as a benefit to software development, improving code quality and speeding up completion time.

Does that ring true with the day-to-day experiences of most software developers? What does this growing integration of AI in software development really mean for the larger industry? How is it influencing the way software engineers and developers work and learn? And what are the implications of these changes?

Voice of B2B, Daniel Litwin, sits down for a one-off interview with John Graham, founder and management advisor at Guildmaster Consulting, to dive into these dynamics and how they’re shaping the software professional’s working world. Litwin and Graham explore the impacts of AI on software development, the shifts in computer engineering workflows, and the changes in the wider STEM ecosystem.

Key points discussed in this interview include:

  • The increasing use of AI language models such as ChatGPT in software development is enhancing productivity through code generation, but the capability is a double-edged sword.
  • As AI models like ChatGPT become more prevalent, the developer’s role may shift toward that of a “prompt engineer.” Developers will need to learn how best to phrase their prompts and requests so these AI systems produce the desired results, which puts a premium on understanding how the underlying models work.
  • There are mixed opinions on how AI will impact job availability in the field of software development. Some believe AI will increase job availability by making developers more productive, while others worry that it might replace certain roles.
  • AI tools in software development and engineering could alter management dynamics, from setting expectations and managing developers to shaping relationships between clients and developers.

John Graham, founder and management advisor at Guildmaster Consulting, is a seasoned engineering professional and management strategy consultant who brings a wealth of industry insights to the table. His professional journey includes roles as a software engineer at Lockheed Martin and director of engineering at Simpli.fi.

Follow us on social media for the latest updates in B2B!
