Today, it is nearly impossible to imagine deploying fully functional hardware or software systems without an extensive and thorough design process. System design is the process of conceptualizing, defining, and describing the various modules, components, and units of a new system or product. A comprehensive system design process outlines everything about a system, from the components and firmware to the human-software interfaces. As technological innovation has grown, systems design engineering has become a formal discipline of study in its own right, and systems engineers now play an integral part in most engineering organizations.
The system development process describes the flow of activities and stages involved in the development of a system or product, from conceptualization to commercialization. There are two major system development methodologies:
For a long time, systems and products were mostly developed in sequential order, from feasibility studies to implementation to testing. The steps were followed in a strict order, leaving product testing until the end of development. This is known as the waterfall development process: an unwavering cascade of stages with high rigidity and little room for rapid testing.
Because of the waterfall process, many organizations would start product development and never see it through to the end, because some errors and bugs were too complicated, and practically unfixable, by the time they were detected. The process also left little room for continuous innovation and concept improvement.
Today, the waterfall development process can still be useful for simple projects with little to no complexity. For intricate and massive systems, however, it becomes infeasible to develop products without iteratively checking for success on smaller tasks and expanding from there.
A few decades ago, systems engineers came up with a better approach known as the agile development process, closely associated with the "shift left" movement, where testing is done repeatedly, on small units, early in the development process. This way, bugs are caught earlier, improvements are made on time, and budgets are less likely to run over.
When done well, it allows engineers to iteratively de-risk product development by starting with the riskiest part of development when the cost to change is low, the ability to change is high, and the level of investment is still low.
As opposed to the waterfall development process where most forms of testing are delayed until the end, following the agile methodology, shifting left, and testing earlier has the following benefits:
Generally, products are composed of multiple systems that need to be designed, tested and validated before launch. For simplicity, we will group all systems into two broad categories: hardware and software. NOTE: for the purposes of this post, we include firmware development as part of software instead of hardware. This typically varies from company to company.
The product design process starts out the same for all system components with the product concept and feasibility studies, but it branches out to fit the needs of each system's design process. It then merges back together with the launch of a fully functioning prototype that is iteratively improved until it is ready to enter the product launch process.
The product concept is the first sign of life for an intended product. It is simply the idea or innovation you want to develop. It includes the goal of the new product, a general overview of what it is, and an analysis of the user problems it intends to solve.
In some instances, it can also include things such as a rough sketch or mind-draft of the intended design, market sizing, competitor and positioning analysis, and a design analysis.
More popularly known as the feasibility study, a feasibility analysis outlines the user needs or problems to be solved by the intended product in more detail. Its main goal is to determine where the key areas of uncertainty are and whether the user needs can be met in a cost effective way.
For complicated systems such as those found in automotive, aerospace and defense, and energy sectors, feasibility studies and uncertainty analyses are done by creating low-fidelity virtual models of the systems that need to be created. This will display the inner workings of the system and show the interfaces that need to be developed between the different systems. When the model is ready, it will be simulated over hundreds or thousands of runs to vary the parameters for which test data is not yet available.
The system design documentation will then be updated with the results including the chances of success, the acceptance criteria, and key areas of risk. Depending on the technology readiness level (TRL) of the system under development, the criteria to move into the implementation stage will vary. For example, a commercial aircraft would likely need to show that there is more than a 99% chance of success from these simulations, but a hypersonic aircraft for which there is very little historical data available might only need to show 50% success.
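The Monte Carlo style of feasibility analysis described above can be sketched in a few lines of Python. The toy model, parameter ranges, and acceptance threshold below are hypothetical placeholders, not a real system model:

```python
import random

def simulate_run(thrust, drag_coeff):
    """Toy performance model standing in for a low-fidelity system model."""
    return thrust / drag_coeff  # hypothetical performance metric

def feasibility_study(n_runs=10_000, required=90.0, seed=42):
    """Vary uncertain parameters over many runs and estimate the chance of success."""
    rng = random.Random(seed)  # fixed seed so the study is reproducible
    successes = 0
    for _ in range(n_runs):
        # Draw the parameters for which test data is not yet available
        # from assumed ranges.
        thrust = rng.uniform(900.0, 1100.0)
        drag_coeff = rng.uniform(8.0, 12.0)
        if simulate_run(thrust, drag_coeff) >= required:
            successes += 1
    return successes / n_runs

chance_of_success = feasibility_study()
print(f"Estimated chance of success: {chance_of_success:.1%}")
```

The resulting fraction is what would be compared against the TRL-dependent acceptance criterion (e.g. 99% for a commercial aircraft, 50% for a hypersonic one) before moving into implementation.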
As noted above, in this blog we will group the system design processes into:
Hardware systems are the physical parts of a product. They contain powered components and moving parts that act in the physical world. Hardware development requires considering a range of materials and electrical and mechanical parts, all of which are accounted for in the design process.
Software systems are the collection of instructions, procedures, and documentation that perform different tasks on the computer system. Software consists of embedded software (also known as firmware) and application software. Embedded software provides the low-level control of a device's specific hardware. Embedded software design involves creating the basic functions of a device and providing services to software at higher levels of abstraction, such as operating systems and applications. Application software performs a specific function directly for an end user or, in some cases, another application. Application software design involves creating software applications as part of a larger system to satisfy the needs of the end user.
There are three important components in hardware systems that need to be designed:
Contrary to general assumptions, the hardware design process is actually complex and intricate. So much so that there is a common saying among hardware engineers that "hardware is hard". This is because physical systems operate in the real world where the permutations that need to be taken into account are endless. As a result, the hardware design process is never linear and many hardware engineers will follow a standard process of iterative de-risking.
Hardware engineers will typically start with the virtual model created in the feasibility study and increase the level of fidelity of the simulation specifically for the riskiest part of the system. They will ensure that they raise the technology readiness level (TRL) of the system for that part before moving on to the less risky parts of the system. For the hardware components that cannot be simulated, they will develop prototypes quickly using 3D printing technology, breadboards, and simple off the shelf components to develop a working prototype that can then be tested.
This process of iterative de-risking and shifting left allows hardware engineers to gain insights faster and solve the highest areas of uncertainty early on, thus reducing the risk of schedule overruns and last-minute surprises.
The board specification is a detailed process that provides information on all components of the central circuit board which controls all electrical and data transmission activities. It specifies the exact type of printed circuit board to be fabricated, with details of the quantity and ratings of components such as transistors, capacitors, resistors, transmission lines, pins, memory chips, and others. This step is extremely important because a single miscalculation or false determination of component type or rating could render the entire product unworkable.
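A simple way to guard against the rating mistakes mentioned above is to check every part's expected operating load against its derated rating. The part names, values, and the 80% derating rule of thumb below are hypothetical illustrations:

```python
# A common rule of thumb is to derate components, e.g. operate at <= 80% of rating.
DERATING_FACTOR = 0.8

# Hypothetical board components: (name, rated value, expected operating value)
components = [
    ("C1 decoupling capacitor (V)", 16.0, 5.0),
    ("R7 current-sense resistor (W)", 0.25, 0.30),   # deliberately over-stressed
    ("Q2 transistor collector current (A)", 1.0, 0.6),
]

def check_ratings(parts, derating=DERATING_FACTOR):
    """Return the parts whose expected load exceeds the derated rating."""
    return [name for name, rating, load in parts if load > rating * derating]

violations = check_ratings(components)
print("Over-stressed components:", violations)
```

In this sketch, only R7 is flagged: its expected 0.30 W dissipation exceeds 80% of its 0.25 W rating, exactly the kind of miscalculation that is cheap to catch at the specification stage and expensive to catch after fabrication.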
A schematic diagram paints a clear picture of the interconnectivity of components on the circuit board. It's a functional diagram consisting of electrical symbols and lines that show the flow of signals from one point to another on the board. Generally carried out using electronics CAD and simulation software, the schematic stage in digital system design is necessary for analyzing data flow paths, opening up new opportunities for faster transmission, and identifying possible board redundancies.
The board layout is often confused with the schematic diagram, and even though both are graphical representations, they serve very different purposes in hardware development. The board layout is a drawing of the physical implementation of the schematic diagram, focused not on showing interconnectivity but on the physical placement of the components. A board layout is essential for planning the position of the components and making early adjustments and re-configurations.
This is the process of creating or building the printed circuit board. The PCB design is transcribed from the layout and schematic onto a real board. The components are laminated onto the board by careful soldering and, as the shift-left methodology dictates, they are tested for conductivity on the go. PCBs come in different forms: single layer, multilayer, flex, rigid, and rigid-flex. These boards are very sensitive and easily damaged, so extra care is taken during the soldering process to avoid damaging the pathways.
The PCB may be part of a larger board and thus, after its fabrication, all other components of the main controlling board are carefully placed in layout order and prepared for the final design assembly.
Mechanical components include all external housing components of the product, including buttons, casing, component covers, handles, wheels, and other static or moving non-electrical parts. A big decision here is whether the mechanical parts will be purchased or custom-built by internal teams. If off-the-shelf, ready-made parts are chosen, teams can skip directly to the mechanical assembly step.
The mechanical specifications are created and imported into CAD software to model all the measurements, scales, and necessary elements of the mechanical components. The CAD design is a highly technical stage and takes up much of the time in the mechanical design process, but designers have to get it right to avoid complications in later stages. Once complete, the CAD files are forwarded to fabricators or mechanical engineers to produce prototypes of the mechanical components.
The fabricators or mechanical engineers will either use 3D printers or industrial milling machinery to build the physical components based on the CAD specifications. Companies typically use 3D printers in the early stages of the development cycle because they are quick, easy to work with, and very flexible. In later stages of development, they typically switch to injection-molded parts, which are more cost effective for mass production.
This is often a moment of truth for both CAD designers and fabricators. Will the components fit together as the mechanical casings and covers are assembled? Will the assembly process be easy, straightforward and dependable? Will the final assembly be easy to spot check? During this design assembly stage, engineers assemble the mechanical components, often analyzing the process intently with a keen eye towards improvement.
This is where every component and material is listed for the upcoming stages. All electrical and mechanical components to be used in the design are specified, along with their ratings, sizes, quantities, and even preferred supplier brands. The resulting Bill of Materials (BOM) is used to estimate the gross margins of the new product, and improvements are made where needed to ensure that the new product will be profitable.
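The gross margin estimate from a Bill of Materials is simple arithmetic: sum the per-unit component costs and compare against the selling price. The line items and prices below are hypothetical:

```python
# Hypothetical bill of materials: (component, unit cost in USD, quantity)
bom = [
    ("PCB, 4-layer", 3.20, 1),
    ("MCU", 2.75, 1),
    ("Enclosure, injection molded", 1.10, 1),
    ("Resistors/capacitors (kit)", 0.40, 1),
]

def bom_cost(items):
    """Total material cost per unit."""
    return sum(cost * qty for _, cost, qty in items)

def gross_margin(selling_price, items):
    """Gross margin = (price - cost of goods) / price."""
    return (selling_price - bom_cost(items)) / selling_price

price = 19.99  # assumed selling price
print(f"Unit BOM cost: ${bom_cost(bom):.2f}")
print(f"Gross margin: {gross_margin(price, bom):.1%}")
```

If the resulting margin falls below target, this is the stage where component substitutions or supplier changes are cheapest to make.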
In this stage, the separate mechanical and electrical units are fully built and tested for functionality, and then the entire cumulative hardware system is coupled and tested again. Without any controlling firmware or software, the system may not yet function fully or properly, but other factors such as connectivity, transmission, motion, material strength, and durability are tested.
The software design process produces the embedded software/firmware and applications that are attached to the hardware to carry out the functionality required. In general, most companies follow a derivative of the agile software development methodology. This is an iterative approach to software development that helps teams get value faster and more reliably.
It involves the software engineering team breaking up the larger deliverable into smaller parts. The team will work in small, incremental steps called "sprints" that typically last one to four weeks. The goal of each sprint is to deliver a version of a working product. The next sprint will add more functionality onto the version delivered in the last sprint and this cycle will continue until the product is ready for final release.
All customer requirements and technical specifications of the intended software system are outlined at this stage, in a document known as the Product Requirements Document (PRD). It must be carried out to get a clear and precise overview of what the software engineers will be developing.
The technical aspects of the software are then outlined based on the customer requirements. This translation is typically done by a software engineer, documented alongside the PRD and submitted as a Request for Comments (RFC) before implementation. The software specifications include all the minute details of the codebase, such as the names and responsibilities of the specific sub-systems, programs, units, device drivers, and the details of the interfaces to be used in controlling each sub-unit.
Depending on the type of system, the RFC could also contain things such as compatibility with different operating systems, backward-compatibility with legacy systems, deployment conditions, prerequisite applications for running the software, and frequency of updates.
The software system design architecture is the organization of components and the intended workflow within the system. It shows how each sub-unit interacts with the other to perform various tasks within the system. This architecture also takes the hardware and its components into consideration.
It's often considered the "software system blueprint" as it provides a detailed overview of the technical, operational, and quality assurance components of the system. It also creates an abstraction for possible evolution of the software in the future.
Once the system architecture is outlined, the operating system (OS) is chosen to meet the technical specifications.
Microcontrollers and microprocessors are in charge of specific sub-units of the system and control designated tasks. Peripherals are the internal storage devices and device drivers. They can range from just a few to dozens, depending on the size and functionality of the system.
During this stage of system design, software engineers will work with hardware engineers to carefully choose the right microprocessors and microcontrollers according to the product specifications. They will typically consider processing power, speed needs, efficiency targets, required uptime, and electrical specifications, among other requirements, to make the right choice.
Generally known as the IDE, this is the development platform where engineers will actually code. Engineers will typically choose a programming language, select an IDE, customize it with extra tools and extensions, then connect it to a CI/CD tool for deployment to make the system design process easier.
The firmware is a set of basic computer programs and very low-level instructions that control the core behavior of the hardware and its interactions with other, higher-level software. Firmware is typically written in low-level programming languages such as C and C++. Common IDEs for firmware developers include Eclipse, Geany, Emacs, and Visual Studio.
This is where most of the hours spent in software system design are expended. Engineers will typically write, test, debug and optimize their code at this stage. There are numerous tools that can help including open source libraries, Github code repositories, and low-code/no-code platforms.
Engineers building embedded systems can cut out most of the time and resources spent on this step by using powerful engineering tools and system design software such as Collimator. Collimator allows engineers to create controls or signal processing algorithms in a natural and intuitive way using block diagrams and instantly convert those algorithms into high quality ANSI standard C language code that can then be deployed onto embedded systems.
Verification is the process of ensuring that the code works on the designed system or platform. According to software design principles, verification is just one part of the testing process. An embedded system could be entirely bug-free and still fail to work on the host system. A fullstack application could be devoid of errors and still unable to run on the designated operating systems. Therefore, the code needs to go through an additional testing step: validation.
Validation checks if overall system performance is up to the required specifications. The verification and validation process will typically start within a sandbox testing environment, then expand to a host system. Before moving to the production stage, systems must go through an extensive verification and validation process to ensure the end-to-end system meets the requirements and that the system is functionally safe.
For companies that follow model-based design (MBD), in-the-loop simulations are used to conduct verification and validation of the hardware and software systems. Model-in-the-loop (MIL), software-in-the-loop (SIL), processor-in-the-loop (PIL), hardware-in-the-loop (HIL), and human/driver-in-the-loop simulations are used in sequential order to validate the results of the model. For example, a MIL simulation would run and the recorded results would be compared to those of the SIL simulation. If the results differ, the model or requirements are modified before moving on to the next step.
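The back-to-back comparison between simulation stages can be sketched as follows. Here the "model" and "software" implementations are trivial hypothetical controllers; in practice the first would come from a simulation tool and the second would be generated or handwritten production code:

```python
def model_in_the_loop(inputs):
    """Reference controller model, standing in for a MIL simulation."""
    return [2.0 * x + 1.0 for x in inputs]

def software_in_the_loop(inputs):
    """Production code for the same controller, standing in for a SIL run."""
    return [2.0 * x + 1.0 for x in inputs]

def back_to_back_check(inputs, tolerance=1e-6):
    """Compare MIL and SIL outputs sample by sample and flag any mismatch."""
    mil = model_in_the_loop(inputs)
    sil = software_in_the_loop(inputs)
    return [
        (i, m, s) for i, (m, s) in enumerate(zip(mil, sil))
        if abs(m - s) > tolerance
    ]

test_vector = [0.0, 0.5, 1.0, -2.0]
mismatches = back_to_back_check(test_vector)
print("MIL/SIL match" if not mismatches else f"Mismatches: {mismatches}")
```

The same comparison pattern repeats at each subsequent stage (SIL vs. PIL, PIL vs. HIL), with the tolerance chosen to account for fixed-point arithmetic or timing effects on the target processor.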
Once the software is finalized and has gone through the appropriate verification and validation processes, the application software is deployed onto the operating system via an executable and the firmware is deployed onto the target hardware which could be a microcontroller, FPGA or even a PLC.
To reduce operational risk and improve the quality of the software even after a system goes into production, many companies choose to continuously integrate and continuously deploy new code. This means that even after production, a team of engineers is monitoring the functioning of the system, determining improvements to make, then building, testing and merging updates.
Engineering tools such as Collimator can be used as part of the CI/CD pipeline to increase the velocity of updates and gain a competitive advantage. The main benefits of using Collimator are that your model remains the one source of truth - even after production is complete - and changes that are made to the model can be instantly pushed and merged to devices or products in the field.
The product launch process is a coordinated effort to bring new products to market. Because there are so many moving parts, this process can introduce risks not seen earlier in development. Thorough planning is especially required for companies that have to meet certification requirements such as DO-178 and ISO standards. The four primary stages of verification and validation, from the earliest to the one closest to mass production, are Prototype, EVT, DVT, and PVT.
Prototypes are supposed to be an "optional" step within the system design process, but in the real world, no engineer will mass-produce a product without creating a functional prototype, as undetected anomalies could be economically and operationally catastrophic.
Product prototyping is the process of creating a small model or replica of the target product to test the concept, check real-world usability, and assess overall mass-production feasibility. System designers will iterate through multiple prototypes throughout the development process and use the insights generated for testing, verification, quality assurance, and as the basis for further improvements. Prototyping can be done for each of the system components of the product:
Rapid prototyping is an agile development practice that slashes the amount of time spent creating prototypes. It is a fast and generally low-cost method of developing a working version of an intended product or system. When done well, engineering prototypes can be finished in a matter of days, depending on the size and complexity of the system. It allows designers and engineers to iteratively mock up interfaces and validate them with customers, thereby reducing development risk.
The Engineering Validation Testing (EVT) phase is a process to confirm that all sub-units of the prototype design are working according to requirements. It's a very critical stage in system design and analysis as EVT will be rated "unsuccessful" if at least one functional requirement in the Product Requirement Document (PRD) is unsatisfied.
Points validated during the EVT may include:
If a product fails in the EVT phase, the final specifications can be opened up for modification and improvement.
The Design Validation Testing (DVT) phase is a system design fundamental that aims to confirm the integrity of the design according to specifications and PRD expectations. The prototype is subjected to real physical stress for the designers to have an overview of its tolerance, strength margins, durability, resistance to environmental conditions, and general usability impressions.
DVT tests include activities such as submerging the product in water to check waterproofing, exposing it to increasing flame intensities, crushing it under heavy molds, and subjecting it to harsh natural conditions. This process allows modifications to be made to the design or material choices if anomalies or deviations are detected.
The Production Validation Testing (PVT) phase is the process of ensuring that the new product is viable for mass production. It's a test of the production line, not necessarily the product itself. Any batching failures, machine misalignments, or other small obstacles that could create downtime during production are resolved before manufacturing begins.
The PVT phase is sometimes overlooked in system design because some engineers assume that if EVT and DVT go well, PVT has no reason to fail. This assumption is most common when production is outsourced to other companies, and it is a mistake, as the downstream effects can still be costly for the new product development schedule and budget.
Traditionally, this was considered the final part of the product development process, but that is no longer the case. Advancements in IoT, cloud computing, and 5G have unlocked numerous opportunities to continue improving systems, delivering more functionality, updates, and fixes even after the product has met the original system design specifications and is in production.
The world around us is vastly more complex today than it was just a few years ago. Today, systems around us from toothbrushes to cars are collecting zettabytes of data and streaming that data in real time to manufacturers and users.
Traditional development tools that are desktop-based cannot ingest and process the amounts of data required, let alone do it in a timely manner. Tools that run proprietary languages have missed the boat on Python, the lingua franca of AI, ML, and reinforcement learning. Relying on such tools forces waterfall development processes that delay insight generation and surface show-stopping design issues late in development, leading to cost overruns and delayed launches.
Companies that want to succeed in the long-run and protect their competitive edge must therefore invest in engineering tools designed for the problems of the future. They must use software that is built for a world where:
Collimator provides a unified environment to design, simulate, test, and continuously upgrade embedded controllers in a world where big data and AI/ML are used to improve system design, reduce development risks, and bring products to markets faster. Try Collimator today to:
Book a live demo with our team to get started!