Version Control Systems Borrow from Software Playbook
Latest News
June 1, 2018
Fast and furious is the modus operandi for most software engineering teams, which rely on open, web-based version control platforms to easily share code and empower innovation. Compare this freewheeling approach to traditional engineering, where design teams are constrained by rigorous processes that lock down work in progress and set limits on agile iteration.
With complexity on the rise and embedded software monopolizing product real estate, the question becomes whether distributed version control concepts popularized in the software world could now influence traditional engineering. As it turns out, the answer is yes. In a show of what’s possible, new cloud-based tools and traditional product data management (PDM) and product lifecycle management (PLM) platforms are starting to borrow concepts from distributed version control systems (DVCS) and establish a middle ground.
“The connection between agile engineering paradigms and more traditional staged-gate processes has to align somehow,” says Christoph Braeuchle, senior director of product management for PTC’s Integrity Lifecycle Manager and ThingWorx Connected Requirements and Validation applications.
Although there is opportunity for alignment, Braeuchle acknowledges the inherent differences between the engineering disciplines and the version control workflows required to support them. Software engineers, for example, need a local work environment with the ability to debug code and understand the flow of software, he explains. Also, most software code is easily compressed, which makes it less onerous to store and distribute compared to detailed 3D models.
This is a different scenario from mechanical engineering, where prototyping is expensive and time-consuming, involving more than sending files across the transom or running automated software builds. “Nevertheless, at some point, embedded software and mechanical hardware designs have to fit together and make for a well-connected product,” he says. “That means the agile DevOps paradigm and the sequential V model have to fit together.”
Design Iteration on Steroids
The danger in sticking with the status quo is an inability to iterate designs and respond quickly to changing market conditions, experts say. Adopting software-inspired distribution and modern version control tools gets traditional engineering closer to the agile development mindset, with the potential for getting a jump on the competition, notes Cody Armstrong, technical services manager for Onshape, a CAD system designed as a cloud-based platform. “Ultimately, it’s all about how quickly you can iterate and get product to market faster,” he explains. “Inevitably, if the competition is getting prototypes out the door faster or products into market before you, that’s the motivation to try something new.”
Onshape, started by a team of CAD veterans and SOLIDWORKS founders, is lauded by industry players as taking the biggest leap to bridge the version control worlds of software and traditional engineering (see DE’s review on page 34). Take the concept of branching and merging, popularized by Git and other DVCS used extensively in software development. Branching is a mechanism for isolating work so individuals can iterate on their own in parallel, then merge their changes in a way that preserves the integrity of the design. Traditional CAD and PDM have had no such mechanisms in place for managing divergent design paths as part of a centralized repository and version control system.
Onshape, on the other hand, lets engineers build variations and experiment with new ideas using branches that will not impact other design collaborators or the original design; when officially “merged,” the various iterations are represented in the final result, essentially combining many ideas into one. Onshape’s approach is possible, Armstrong says, because it is built on a modern database-driven architecture, unlike traditional file-based CAD tools, which handle the practice by overwriting or replacing one file with another.
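For readers coming from the hardware side, the branch-and-merge idea can be made concrete with a toy example. The sketch below is purely illustrative: it assumes a design reduced to a handful of named parameters and a simple three-way merge rule, and it is not how Git, Onshape or any PDM system actually stores or merges design data. Changes made on only one branch carry into the merged result; the same parameter changed differently on both branches is flagged for manual resolution.

```python
# Toy model of branch-and-merge semantics for a parameter-driven design.
# Illustrative only: the parameter names and merge rule are assumptions,
# not any vendor's implementation.

BASE = {"bracket_thickness_mm": 4.0, "hole_diameter_mm": 6.0, "fillet_radius_mm": 1.5}

def merge(base: dict, branch_a: dict, branch_b: dict) -> dict:
    """Three-way merge: a value changed on only one branch wins;
    the same value changed differently on both branches is a conflict."""
    merged, conflicts = {}, []
    for key in base:
        a, b = branch_a[key], branch_b[key]
        if a == b:                 # unchanged, or both made the same change
            merged[key] = a
        elif a == base[key]:       # only branch B touched it
            merged[key] = b
        elif b == base[key]:       # only branch A touched it
            merged[key] = a
        else:                      # both changed it differently
            conflicts.append(key)
    if conflicts:
        raise ValueError(f"manual resolution needed for: {conflicts}")
    return merged

# Two designers iterate in isolation on copies of the same base design.
branch_a = {**BASE, "bracket_thickness_mm": 5.0}   # one thickens the bracket
branch_b = {**BASE, "hole_diameter_mm": 6.5}       # the other enlarges the hole

print(merge(BASE, branch_a, branch_b))
# {'bracket_thickness_mm': 5.0, 'hole_diameter_mm': 6.5, 'fillet_radius_mm': 1.5}
```

Git applies the same three-way logic to file contents; Onshape’s database-driven architecture applies the branch-and-merge concept to design data directly rather than to files.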
The latest Onshape release further narrows the gap between hardware and software version control workflows by fully integrating its Release Management and Approval Workflows feature into the base platform, rather than offering it as a separate application that has to be configured independently, Armstrong says. As part of this capability, the traditional check-in and checkout processes standard in older PDM platforms no longer apply, giving collaborators instant access to a design without having to check out or update a version. Additionally, the new release lets released designs be referenced anywhere, using the Insert dialog filter to track history while ensuring that a change isn’t mistakenly inserted before formal release.
“We want users to be able to release designs and manage approvals, but we never want to block them from creating something that is inspiring,” Armstrong says.
Autodesk, which came late to the PLM market with a built-from-the-ground-up cloud-based platform, also specified version control as a core capability of its Fusion Lifecycle platform, unlike competitors, many of which deliver the functionality as an optional bolt-on, according to Bankim Charegaonkar, senior product manager for Fusion.
He explains Autodesk made that decision early on after discovering many potential PLM customers weren’t using formal version control, but rather relying on manual processes or simple network file server management. In addition, Fusion Lifecycle’s cloud-based architecture allows for easy extended collaboration—one of the core benefits of distributed version control, he says.
With a core version control foundation in place, Autodesk began exploring more sophisticated capabilities, including whether it made sense to incorporate elements of software-inspired DVCS, he says. The company launched a preview, which showcased a form of branching and merging to allow engineers to isolate work so they could riff on a design without impacting collaborators; however, the team found it didn’t go far enough and created additional problems when trying to merge changes. Today, Autodesk is looking at intelligent automation as a possible solution to the challenge.
“The real problem is continuous integration and delivery—you need a way to have all the changes be integrated all the time while validating that the changes are not breaking something else,” Charegaonkar says. “The longer someone is in a space of isolation, the more divergence we get. Automation can provide feedback if someone made a change that will result in a clash.”
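As a rough illustration of the kind of automated feedback Charegaonkar describes, the sketch below runs a simple interference check over an assembly after every integration, using axis-aligned bounding boxes as a stand-in for real geometry. The part names, dimensions and bounding-box approach are assumptions chosen for brevity, not any vendor’s actual clash-detection engine.

```python
# Sketch of an automated post-merge clash check (illustrative assumptions only).
from itertools import combinations
from typing import NamedTuple

class Part(NamedTuple):
    name: str
    min_xyz: tuple   # (x, y, z) lower corner of the part's envelope, mm
    max_xyz: tuple   # (x, y, z) upper corner of the part's envelope, mm

def overlaps(a: Part, b: Part) -> bool:
    """True if the two envelopes intersect on all three axes."""
    return all(a.min_xyz[i] < b.max_xyz[i] and b.min_xyz[i] < a.max_xyz[i]
               for i in range(3))

def clash_report(assembly: list) -> list:
    """Pairwise check of every part against every other part."""
    return [(a.name, b.name) for a, b in combinations(assembly, 2) if overlaps(a, b)]

# After a merge, the integrated assembly is re-checked automatically.
assembly = [
    Part("bracket",  (0, 0, 0),  (50, 20, 4)),
    Part("sensor",   (45, 5, 0), (70, 15, 10)),   # a recent change moved this part
    Part("fastener", (80, 0, 0), (86, 6, 12)),
]

for a, b in clash_report(assembly):
    print(f"CLASH: {a} interferes with {b}")   # feedback to the engineer who merged
```

In a continuous-integration setup, a check like this would run on every integrated change, returning results to the engineer before divergence accumulates.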
PTC also sees merit in merging the two development disciplines, but through integration and orchestration rather than by building new DVCS features into the core Windchill PLM platform, Braeuchle says. PTC offers software configuration management through its Integrity Lifecycle Manager platform, which interfaces with Windchill as well as with popular software development tools such as the Jenkins continuous integration server and Git-based solutions.
As part of its efforts in this space, PTC is working on a workflow orchestration platform, previewed this month at its LiveWorx conference, which will facilitate information exchanges between different platforms in real time while sending notifications on updates and changes. “This idea of orchestration of a heterogeneous work environment is important because a monolithic, centralized repository is not fit for the agile engineering world,” he says. Using the orchestration capabilities, PLM will serve as the curator of all of the information, presenting it in a system and in a manner that makes sense for a particular role doing the work, he explains.
For Aras, maker of the Innovator open PLM platform, there is nothing about PLM data that rules it out from being handled using distributed version control principles, yet the company would need to see demand among its user base and believes the need will vary by industry, according to Rob McAveney, the firm’s chief architect. “PLM manages a much more heterogeneous set of data than source code control systems, with different data types having different configuration management rules,” he explains.
“PLM also doesn’t generally organize data in bite-size chunks like repositories—rather, it’s a large interconnected network of data with no clear delineation of project and product boundaries.” Formal change management and the large size of CAD files would be additional obstacles to making a clean break with traditional version control practices in favor of software-focused DVCS, he says.
It’s not only inevitable but essential that more DVCS capabilities make their way into traditional engineering tools, contends Michael Tiller, president of Xogeny, which specializes in web-based engineering analysis tools. Although such capabilities are crucial for collaboration and effective content management, he acknowledges the difficulty of blending the two version control worlds and says it will take time, and a proven track record, to make established engineers comfortable with a new paradigm.
“The problem is not everyone has the luxury to start with a clean slate,” he says. “People aren’t willing to give up the proven aspect of their tools without some really serious benchmarking, and that can make it difficult to move things forward in a big way.”
More Info
Aras
About the Author
Beth Stackpole is a contributing editor to Digital Engineering. Send e-mail about this article to [email protected].