The history of source code analysis is – read on to see why – a sad one. And as so often, when something has been writing sad history, it is time to look ahead and get over the past. Turning software quality analysis from a blame game into a Game of Thrones of bug-bashing that drives the success and adoption of software is at the core of the “next generation” code analysis movement we see sparking up everywhere. Given the attention the source code tool market received after the recent acquisition of GitHub, it may be the right time to write about another branch of source code tools.
Source Code Analysis 1.0 – The Rise of A.I. … eh, no. Q.A.
Ever since people have written software, they have done so on computers – ignoring a few paper coders from the Moon landing era. Surprisingly, code reviews also started on printed paper. But as usual, people began writing code to verify and inspect their code. And as those tools spread, they were bundled into software and commercialized – producing the first code analysis tools.
The powers of that day were focused more on the business value of the algorithm than on user experience, so those tools were terrible to install and written in terribly ugly front-end frameworks that, for some players, still look the same today. Stuff even worse than Java AWT, mostly filled with text boxes, combo boxes and, at best, some very random and minimal chart-drawing pane. Terrible.
Because of all this, it took skill and training to implement, run and work with these tools. And with the fad of the day – automotive-inspired TQM spilling over into software development via the V-model and Waterfall – you soon had a bunch of testers and QA people under your roof.
That market still exists today, and almost no company operates without dedicated software testers. Universities teach it. ISO and compliance frameworks require them. And the tools that be are still out there.
The downside of all this is manifold. First of all, the area is so boring that normal developers, architects and engineers do not want to do it. That gives you a lemon market for the team, and it further strengthens the big, chunky wall between developers and testers – a wall that spilled upward into how the compliance, quality and risk teams around development formed their own monolithic blocks. No microservices or agility there.
And of course, that organizational complexity only added to the detachment of the people running development organizations and managing them to advance their companies – deepening the overall disconnect from what is going on at the bottom of the hierarchy, where the smart magic of product design, development and testing happens.
This is, in a way, why the next iteration seemed to come out of nowhere. Why? It is only natural that the people who actually have to do the work – the developers and engineers – lost control over the results of that work, as more wardens were added to watch the prisoners and more processes were added to keep control of a system that the people at the top had long since lost the ability to steer directly. When process overhead and the sheer number of wardens and cooks kill performance, people start hacking software into existence around the failing system.
The DevOps stage – Or “All power to the Engineering Rockstars”
DevOps is a massively abused word, and there probably isn’t a single unifying definition of it out there. But for now, let’s say the DevOps movement can be understood as developers slowly taking over the entire operations stack around them, replacing processes, wardens and cooks with tooling that is targeted less at generic best-practice process installations and more at the actual job. DevOps created a lot of tools spanning development and testing, then moved into infrastructure automation and orchestration, replacing SysOps, and so on.
More interesting is what happened with this movement, and in which context. The context is clear – call it global agile. Thanks to the open source movement, hackathons, and improved communication and community-building infrastructure, the top engineers started to decouple their work from their actual jobs to some extent, forming loosely coupled, short-lived, ungoverned networks: learning from each other and interacting not in going-concern corporations but in project teams, variable-commitment open source projects, sprints and meet-ups. They showed a way to develop software without all the formal structures and processes. And – this is the interesting point – without introducing a schism between QA and development, and without a disconnected manager or C-level warden looking over them.
This way of working was adopted by some tech companies and proliferated into the emergence of agile, Scrum and whatever other hip organizational model you find in modern IT organizations.
The big result of this: the QA people got thrown out. The C-level understood even less of what was going on and lost all its process-based control systems over the entire effort.
It somehow works a bit better and is lauded by everyone, but the disconnect between managers and developers breeds suspicion and puts the power into the magic hands of modern software engineers – while still leaving all the risk and responsibility for execution and delivery with the manager, who now has to work around the disconnect.
Leading to yet another phase that we see emerging recently.
Solving the Disconnect – C-Level Solutions on the rise
Naturally, as software permeates everything, as its impact becomes more of a public-agenda issue, and as a company’s ability to survive or retain a decent share price is affected by software failures, the C-suite is thinking about how to regain a bit of control over the entire process.
There is one development in this DevOps world that supports this movement. DevOps is also defined by the “programmability of everything”. And every instruction, and every ordering of instructions, in an IT system leaves a trace: logs, snippets, code artefacts and so on.
That means all the magic that is happening is completely measurable. And as our advances in data ingestion, preparation and visualization at big-data scale – and in machine-learning-based processing of massive data chunks – make this an ever easier domain to operate in, tool and platform makers are starting to emerge who do not follow the fDbD paradigm (for developers, by developers) but an fMbD one (for managers, by developers). In other words, some developers who do not like their newly gained power and freedom are snitching on their peers, working on how to measure and monitor what is going on and make it accessible and readable for the C-suite. For the benefit of mankind, of course. Because loosely coupled swarms of developers don’t always perform best on quality assurance and software safety KPIs, and that matters both for (a) the success of the companies that create products – and the jobs of the developers who create them – and (b) the users and consumers of the software.
The only real issue here is this: no matter how you twist and turn it, you are snitching on developers, and you are merely trying to reinstate an old system of authoritative control – something that has already failed in the past. That is why, after this short-lived fad dies off, which it likely will soon enough, there has to be a participatory approach: one that does not reinstall the old control mechanisms but unites the capabilities and interests of both the C-suite (the job security suite) and the quality assurance suite (the no-dead-consumers suite). Which brings us to the new kids on the block.
Software Analysis 2.0 – Code-centric Multi-Sided Platforms for managing development
At Acellere, we see ourselves clearly in this spot. We understand that the old QA model has its validity, but we dislike the rivalry between QA teams and developers. For us, the trend is towards solving the QA challenge through automation and augmentation of the development experience, merging QA and development. Put differently: QA is a time-spent problem that automated and augmented solutions can address.
We also see that development teams, no matter how smart and savvy they are, are not always fully aware of – or necessarily interested in – the side of the job that deals with consumer safety, competitive product delivery, getting features and capabilities to market, project survival, and the fundraising needed to maintain the resource sufficiency to drive and steer a software product. So we clearly see that the DevOps paradigm is also a failing paradigm.
There is a reason why a whole organization – be it a business, a funded open source project or a non-regulated organization based on crypto and smart contracts – still delivers far better products, performs better at containing the externalities of those products’ existence by attaching governance and compliance against reasonable standards and laws, and should continue to exist.
A whole organization is one that combines its key stakeholders without falling back into an old organizational design habit: that of authoritative control and sticky, process-driven governance. The only natural result that solves this challenge is a well-designed platform that connects the stakeholders in a way that gives them the right signals and information, supports joint and participatory collaboration on the shared goals of the organization, and puts emphasis on sufficient compliance with requirements – beyond the technical ones – without allowing any sort of blame game to be played. The collaboration must be educational, transparency-forming, unifying and aligning, in a collaborative setting.
Hence the next generation of products will be multi-sided platforms: multi-sided platforms around the entire development process that aggregate and prepare the artefacts generated in that process and make them readable and digestible for the stakeholders.
The reason source code analysis is a good starting point for this is simple: the commit-to-a-branch level of source code perfectly balances the frequency of information – single code-item changes by 10,000 developers per second are simply cognitive overload, while waiting for shipment, integration tests and runtime performance tests is too slow.
That is why it is static code analysis 2.0, and not software development 2.0. The trick is to gamify the process – or simply to make it engaging and impactful – so as to shape the way we write software: meaningful software, in short delivery cycles, with the levels of quality and security the world requires. It’s that simple.
To recap: Source Code Analysis 1.0 is a thing of the past – it was the QA era, and the QA era failed. DevOps emancipated engineers from the QA era, but it creates persistent quality issues with externalities that are hard to control – a volcano waiting to blow. C-level solutions are emerging to make those externalities manageable again, but they revert to an authoritative control regime that won’t fly. The next generation will be source-code-based, thanks to the frequency of its feedback loops, but in essence it is just the entry step into what comes after DevOps. Let’s call it DevOrgs. Or whatever.