Tuesday, January 28, 2014

HIL in software test automation toolchain

Introducing HIL in software testing.

A typical toolchain for Test-Driven Development (TDD) in a software project may consist of a continuous integration framework, a static code analysis tool, a unit test tool, and possibly an acceptance test tool. With OSS tools, the chain may consist of Jenkins, Cppcheck, CppUnit, and the Robot Framework, as one example among many. That's a good toolchain for pure software development, but for embedded systems it's missing the HIL aspect.

The Robot Framework project defines itself as a generic test automation framework for acceptance testing and acceptance test-driven development (ATDD). Robot has gained popularity recently within my organization. The framework is written in Python, which seems to be a popular language among many of our developers.

Robot is well suited for testing web user interfaces, communication protocols, database operations, and all such cases where the behavior of the software under test can be monitored via external interfaces of some sort, without introducing any test instrumentation into the software itself. In acceptance testing, the intention is to test the production software the way it is supposed to be used in real use cases, which is why instrumentation of the kind used in unit testing is not acceptable.

Robot is keyword-driven, which enables different abstraction levels, as one keyword can be defined in terms of other keywords. In addition to the Standard Test Libraries, there is an extensive set of community-contributed External Test Libraries for many different purposes. When implementing test cases, one can easily combine keywords from different libraries without needing to do any actual coding. A graphical Robot IDE (RIDE) is available to make it easy to write and maintain test case sets.
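As a sketch of how little code a new keyword library needs: a plain Python class works as a Robot library, with each public method exposed as a keyword. The relay-control example below is entirely made up for illustration.

```python
class RelayLibrary:
    """A made-up Robot Framework keyword library: Robot exposes each
    public method as a keyword, e.g. `Set Relay State` and
    `Relay Should Be On`."""

    def __init__(self):
        self._relays = {}   # relay name -> on/off state

    def set_relay_state(self, name, state):
        # In Robot syntax:  Set Relay State    K1    ON
        self._relays[name] = (str(state).upper() == "ON")

    def relay_should_be_on(self, name):
        # In Robot syntax:  Relay Should Be On    K1
        if not self._relays.get(name, False):
            raise AssertionError("Relay %s is not ON" % name)
```

Higher-level keywords could then be composed from these in plain Robot syntax, without writing further Python code.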

Robot supports a Remote Server concept similar to gdbserver. With the help of a Remote Server, Robot Framework can communicate with test libraries located on a different machine, or written in a different language. It enables distributed testing as well. The Remote Server uses XML-RPC over HTTP for communication between the framework and the server.

Robot Framework Remote Server architecture.
Readily available Remote Servers exist for Python, Ruby, Java, .NET, Perl, and more. As XML-RPC is well documented, one can easily implement servers for other languages as well. There even exists a lightweight XML-RPC server implementation for Arduino.
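To make the idea concrete, here is a minimal Python sketch of the server side of the remote protocol: the framework calls get_keyword_names() to discover keywords and run_keyword() to execute one, receiving a status dictionary back. The example library and its keyword are made up; this is a sketch, not a full implementation of the official remote server.

```python
from xmlrpc.server import SimpleXMLRPCServer

class MinimalRemoteServer:
    """A minimal sketch of the Robot Framework remote protocol."""

    def __init__(self, library):
        self._library = library

    def get_keyword_names(self):
        # Robot calls this first to discover the available keywords.
        return [name for name in dir(self._library)
                if not name.startswith('_')]

    def run_keyword(self, name, args):
        # Robot calls this to execute a keyword; the reply is a dict
        # with at least 'status' and 'return' fields.
        try:
            result = getattr(self._library, name)(*args)
            return {'status': 'PASS',
                    'return': result if result is not None else ''}
        except Exception as err:
            return {'status': 'FAIL', 'return': '', 'error': str(err)}

class ExampleLibrary:
    """A made-up test library with a single keyword."""

    def count_items(self, items):
        return len(items)

def serve(host='127.0.0.1', port=8270):
    # 8270 is the conventional Robot remote-server port; in real use
    # this would run until shut down.
    server = SimpleXMLRPCServer((host, port))
    server.register_instance(MinimalRemoteServer(ExampleLibrary()))
    server.serve_forever()

# Exercising the protocol layer directly, without the HTTP transport:
remote = MinimalRemoteServer(ExampleLibrary())
reply = remote.run_keyword('count_items', [[1, 2, 3]])
```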

The Robot Remote Server concept sounds like the solution for integrating software testing with other domains, especially hardware testing, as discussed in my previous posting. National Instruments LabView is possibly the most popular solution for hardware testing. Implementing an XML-RPC server in LabView and integrating it with a test library in LabView would make it possible to communicate with hardware test routines from software test cases.

Disclaimer:
Personally, I'm not a big fan of XML. XML-RPC consumes about four times more characters than plain XML, which is by itself bloated compared to JSON, for example. XML-RPC is fixed to HTTP transport, which is not the most efficient method for two-way communication. JSON is not tied to any specific carrier protocol; WebSocket is possibly the most common and obvious choice, and WebSocket itself is more efficient for back-and-forth data transfer than HTTP. I hope one day someone will introduce a JSON/WebSocket implementation to replace XML-RPC.
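As a quick illustration of the overhead, the same call can be encoded both ways with the Python standard library and the sizes compared; the method name and parameters below are invented for the comparison.

```python
import json
import xmlrpc.client

# One hypothetical remote call, encoded as an XML-RPC request body
# and as an equivalent JSON-RPC-style message.
params = ({"keyword": "Set Voltage", "args": [3.3, "channel-1"]},)

xml_body = xmlrpc.client.dumps(params, methodname="run_keyword")
json_body = json.dumps({"method": "run_keyword", "params": list(params)})

# The XML-RPC body is several times larger than the JSON one.
print(len(xml_body), len(json_body))
```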

Thursday, January 23, 2014

Software development with Hardware In the Loop

Hardware In the Loop (HIL) simulation has been used in the development of complex embedded systems. However, it does not need to be limited to complex systems only; the development of simple embedded systems may benefit from the approach as well.

In traditional HIL simulation, the physical system, the "plant", is modeled mathematically as a "plant model". The system under test (SUT) then communicates with the model instead of the real physical system, which may be expensive, dangerous, or hard to reach during development time.

The challenge in HIL simulation is the effort and time (i.e. cost) required to create and maintain a model that simulates the real-world physical process with sufficient accuracy. Because of that, HIL has traditionally been limited to the most complex and safety-critical systems only, such as those in the automotive and aviation domains.

HIL is not only for verification of the complete system; it can be utilized already in the early phases of the hardware/software co-design process. In pure software development, methods like unit testing and integration testing have been in use for a long time. Why not introduce Hardware In the Loop already in the unit test phase of embedded software development? Then the whole plant model does not need to exist, only tiny parts of it, and over time the model becomes more and more complete, piece by piece.

Wikipedia says: "An HIL simulation must include electrical emulation of sensors and actuators. These electrical emulations act as the interface between the plant simulation and the embedded system under test."
HIL Interfacing
In the illustration above, I refer to sensors as Stimulus and actuators as Response. I prefer those expressions, as we may introduce HIL already when the target hardware does not exist. Then we not only simulate the plant model outside the SUT via sensors and actuators, but also simulate the missing hardware subsystems inside the SUT. Software development and HIL testing can begin with a plain CPU/MCU evaluation board only, and there is no need to wait for the first prototype of the target hardware to exist.

In order to execute software unit and integration testing on the target hardware or an evaluation board, some sort of communication and software instrumentation is needed in the target, to enable execution of individual functions upon request and to collect feedback. Then we have three feedback cycles, as illustrated:
  1. Provide a physical stimulus and inspect what actions it causes in the software
  2. Execute a software function and inspect the physical response
  3. Let the software run and inspect what kind of physical response is caused by a physical stimulus
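The three cycles can be sketched in Python with hypothetical stand-ins: FakeRig plays the role of the stimulus/response hardware, FakeSut the instrumented software under test. None of these names come from a real API.

```python
class FakeSut:
    """Instrumented software under test: turns an actuator output on
    when the sensed input voltage crosses a threshold."""
    THRESHOLD_V = 2.5

    def __init__(self):
        self.actuator_v = 0.0

    def on_sensor_input(self, volts):
        self.actuator_v = 5.0 if volts > self.THRESHOLD_V else 0.0

class FakeRig:
    """Test rig: applies a physical stimulus and reads back the
    physical response."""

    def __init__(self, sut):
        self.sut = sut

    def apply_stimulus(self, volts):
        # cycle 1: physical stimulus -> action in the software
        self.sut.on_sensor_input(volts)

    def read_response(self):
        # cycle 2: inspect the physical response
        return self.sut.actuator_v

# cycle 3: stimulus in, response out, with the software in between
rig = FakeRig(FakeSut())
rig.apply_stimulus(3.0)
response = rig.read_response()
```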
Now we have full control over the whole process, including the physics, electronics, and software phenomena. My suggestion is to start with baby steps, introducing HIL at the earliest possible phase of software development. Then we start gaining savings in expenses, reduced project risks, and improved safety, especially by expanding the test coverage.

HIL is an essential part of embedded systems regression test automation, and now we're talking about the money!

Edit Jan 28th:
Clarification of what I mean by "talking about the money": manual regression testing is awfully expensive, and that is the motivation for automated regression testing.

Tuesday, January 14, 2014

Intel - the walking dead

Referring to Intel's latest Edison release at CES, I mentioned: "Perhaps now they have invented something where they can be good at." After investigating the Quark architecture further, I no longer think so.

The Quark processor is based on the Pentium architecture, as Intel states. This means it is ultimately based on the 32-bit x86 architecture, which dates back 40 years. Why on earth did Intel select the Pentium instead of the Atom as the starting point for Quark? I can only conclude that Intel has reached a dead end in its effort to decrease the energy consumption of the Atom architecture.

Even if the Atom cores are advertised as energy efficient, the overall energy consumption of the whole CPU environment, including the chipset and other parts, is way above the acceptable limit for embedded systems. With Quark, Intel advertises that they have managed to drop the energy consumption down to one tenth of the Atom cores'. Such a tenfold reduction of the Atom's energy consumption is simply not feasible, so something else was needed, and the answer was reverting back to the Pentium architecture.

Well, let's rewind the clock back 20 years. With the Pentium architecture you lose all the processor development Intel has done for the last two decades: no 64-bit registers, no out-of-order execution, no MMX, no SIMD, and more. All this means that you cannot run the latest Microsoft software on the CPU. Time to dig out the floppies to find DOS and Windows 3.1... Linux runs, and it is the only supported OS at the moment.

The concept of integrating the whole CPU environment with connectivity into an SD-card form factor is clever. I can immediately imagine several use cases for the approach: a system having bare-bones functionality implemented with a tiny MCU, and if connectivity and a user interface are needed, they can be provided in the physical form of an SD-card add-on module. You get both a minimal-cost base system and state-of-the-art Internet of Things support.

The Quark is just Intel's latest effort in its struggle against non-existence in the embedded market. I'm not convinced, and I hope someone will soon introduce a similar concept based on the ARM architecture.

Edit Jan 18th:
Intel confirmed it will cut 5,000 jobs, 5% of its global workforce, due to falling computer sales. As Intel fails to enter new market segments while simultaneously losing sales in its traditional markets, the company can't stand for long. Google is said to be developing its own processor for data center use, which may indicate Intel losing that market as well.

IBM and NVidia have announced a joint effort to create more energy-efficient computing solutions for data centers for enterprise and scientific use. At the moment, Intel+NVidia rule the Top 10 of the Green500 list, but that may change soon if Intel doesn't improve. As Intel has failed to improve the energy efficiency of its embedded CPUs, I'm not optimistic they will manage to do so in supercomputers either.

http://www.reuters.com/article/2014/01/17/us-intel-jobs-idUSBREA0G1I420140117

Guido Stepken also discusses Intel power efficiency in his posting The irrelevance of Microsoft/Intel vs Linux/ARM.

Edit Feb 11th:
My dreams have come true! Electric Imp has released an ARM-based, WiFi-enabled embedded computer in an SD-card form factor! It's available at Sparkfun and other distributors. More information and instructions at the Sparkfun site: https://learn.sparkfun.com/tutorials/electric-imp-breakout-hookup-guide/all



Thanks to Jan Tångring @ ElektronikTidningen for the hint. Here is the original article in ETN: http://www.etn.se/58507 (in Swedish).

AdaFruit has also published an article: http://www.adafruit.com/blog/2012/12/06/new-product-electric-imp/

Change in embedded 3D graphics market

Nvidia released its latest offering in the Tegra family at the CES fair last week. The K1 CPU will be a game changer in consumer electronics. What makes it so special? At the moment there are only two companies with high-end 3D graphics acceleration assets, and those two, AMD and NVidia, dominate the PC graphics card market.

What differentiates the two companies from each other is the fact that AMD does not have a reasonable offering in the embedded CPU sector. AMD is trying to enter the embedded market with its x86-architecture G-series SoCs, but they are not quite there yet. AMD provides a DX11 interface, but it lacks an answer to the CUDA architecture. The major benefit of AMD's x86 architecture is that it can run the full-featured Windows operating system. But hey, has anyone seen any Windows phones and tablets around? (*)

The K1 offers all the same graphics acceleration interfaces that full-featured PCs and consoles do, including OpenGL 4.4, DX11, and CUDA 6.0. That means PC/console games should be rather straightforward to port to the new CPU platform. Console-level gaming on tablets is just around the corner. And as the K1 is based on the ARM architecture, Android is just a plug-and-play exercise for a device maker.

Currently Qualcomm dominates the high-end tablet and smartphone market, with almost 30% operating margin. In graphics, Qualcomm does not have such assets and is stuck at the OpenGL ES level. Qualcomm has high-performance graphics acceleration integrated into its Snapdragon processors, but as it has a different graphics architecture, which makes porting difficult, I do not expect them to win the high-end gaming market. If you haven't sold your Qualcomm shares yet, perhaps now is a good time to do it.

Intel is a next-to-nonexistent player in the embedded market. I have never really understood the justification for the existence of the Atom processors; they are not good at anything. Now at CES, Intel introduced an SD-card-sized computer with a Quark processor. The concept is called Edison. That's interesting, but definitely aimed at something other than high-performance graphics: wearable computing, as Intel says. Perhaps now they have invented something where they can be good at.

At the moment, NVidia does not have a proper modem offering. If they ever introduce one, or cooperate with Broadcom, they will have very strong assets for smartphones as well. Broadcom has a very strong position in connectivity chipsets at the moment, and as it recently purchased Renesas Mobile (the former Nokia modem division), it does have reasonable LTE-modem technology in hand.

Nvidia is strongly pushing itself into the automotive market. Audi, BMW, and Tesla are currently using Tegra technology in their latest models. At CES, Audi demonstrated a virtual instrument cluster running on the K1. The 3D software technology in use comes from Rightware, which provides the Kanzi 3D solution with a runtime rendering engine and UI development environment.

Rightware has its headquarters just next to ours in Espoo, and I already have a Kanzi 3D demonstration setup on my desk. In my demo, Kanzi is running on a Freescale quad-core i.MX6 CPU with Android. The i.MX6 is a nice chip, but it is limited to OpenGL ES graphics acceleration. That's definitely good enough for 3D visualization in working machines and vehicle environments, but it is not powerful enough for high-end gaming with high 3D graphics processing requirements. Actually, we have recently designed a professional vehicle graphical control panel with the i.MX6 CPU that can benefit from 3D visualization technology.



*) In the local elementary school where my daughter attends fifth grade, they are investing in new technology for teaching. Recently the school purchased tablets for all the students. Today at home my daughter reported that for a certain classroom assignment, half of the class got iPads and the rest got Surfaces. The lucky ones who got the iPads finished the assignment in half the time compared to those with Surfaces.

Some further readings:
Rightware press release: The New Audi TT Instrument Cluster Created with Rightware Kanzi UI Solution
The Motley Fool: NVidia's Tegra K1 Completely Changes Mobile Gaming
CNet: NVidia K1 chip sees the open road
The Verge: Intel announces Edison, a computer the size of an SD card

Friday, January 10, 2014

Embedded trends 2014

It's the time of year to predict what will happen in the embedded market this year.

Multicores

The Big Things will be multicore architectures and 3D graphics. Of course, these have existed in consumer electronics for years, but I'm looking at things from the traditional embedded systems point of view. The driving force is the car industry. A modern premium car may have more than 100 CPUs and MCUs in total. What a nightmare from the software development management point of view!

The car industry will put significant effort into the development of multicore software development methods and tools within the next few years, in order to reduce the number of micros in a car. This will lead to significant challenges when several safety-critical functions are combined: how to guarantee the performance and isolation of software blocks. Virtualization is one key technology enabling the consolidation of functions.

In the mobile device domain, smartphones and tablets, a major revolution will occur in gaming. New CPU releases like the Nvidia Tegra K1 will change the market by providing PC/console-level graphics support in mobile devices (nice article in AnandTech). That will influence industrial embedded as well, by setting a new level of performance expectations. The car industry is once again a key player in 3D graphics, as display-based dashboards (article in Wired) and infotainment penetrate from premium to standard models.

Industrial internet

There is a quiet revolution ongoing in industrial business models. Companies no longer sell devices to each other, but provide services and get paid according to usage or capacity. The classic example is Rolls-Royce no longer selling jet turbine engines, but "hot air behind the plane": RR charges customers based on flight hours. In order to maximize up-time and minimize down-time, preventive maintenance and remote condition monitoring are needed. RR is said to know the state of an individual engine in flight better than the crew in the cockpit does.

Why does RR want to have it like that? Because there is bigger money in the services and spare parts business than in the initial sales of engines. When providing the engine as a service, RR has full control over the whole supply chain throughout the life-cycle of the engine. The same analogy exists in many other fields of industry: cranes, elevators, generators, compressors, etc. For the paying customer, it is all about financial risk management: initial investments are minimized, and you pay only for usage, in proportion to your production.

The industrial internet, in terms of the Connected Factory, Wireless Sensor Networks, the Internet of Things, etc., will be a major enabler of the new industrial business models. Big data is not a driving force; it is a necessity due to the flood of information provided by the other means.


Monday, January 6, 2014

RIO grande railroad

While brainstorming further on the idea of model railway control, I received a myRIO box from NI, which opened my mind.

RIO Architecture

National Instruments provides several RIO product families for various measurement and instrumentation needs: FlexRIO, CompactRIO, Single-Board RIO, and myRIO. All the RIOs share the same basic architectural concept of CPU, FPGA, and I/O. The name RIO stands for Reconfigurable I/O.


The FPGA makes RIO solutions good for fast real-time and I/O-intensive signal processing. For different purposes, NI provides a number of LabView modules that integrate with the FPGA. The user does not need to know anything about VHDL or FPGA programming in order to benefit from the performance of the FPGA chip; LabView takes care of that. With the LabView FPGA Module, there is also the option to program the FPGA directly in VHDL, if needed.

myRIO

Recently, NI introduced the myRIO product for educational purposes. The device has a dual-core ARM Cortex-A9 CPU and a Xilinx FPGA chip. Other key features include WiFi and USB connectivity, 40 lines of digital I/O, 10 analog inputs, 6 analog outputs, and audio in/out.

NI myRIO
The myRIO CPU runs a Linux RTOS, fully supported by the LabView graphical programming environment. The reasoning for Linux is presented in this NI whitepaper. In addition to LabView programming, C/C++ can be used, and the Eclipse IDE is officially supported by NI as well. However, when implementing a program in C/C++, the LabView FPGA Module is still needed to get access to the I/O, as it does not make sense to use this specific hardware without using the FPGA and I/O.

Model railway control

As said, I just received the myRIO box I purchased for evaluation purposes. After spending quite a lot of time thinking about and investigating control solutions for a model railway, as told in my previous posting, I rather soon realized that this box is suitable for implementing the safety functions of a railway.

With 10 analog inputs, it can monitor the presence of a train on 10 blocks of track, simply by measuring the voltage over a series shunt resistor. 6 PWM outputs or 6 analog outputs make it possible to drive DC motors on 6 blocks, enabling speed ramp-up and ramp-down functions. The rest of the available I/O can be used for train position sensors, switch control, and signals (semaphores). The USB host interface can be used for connecting RF modules for wireless communication with individual trains.
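Those two ideas, occupancy detection and speed ramping, can be sketched in a few lines of Python; the threshold and step values below are invented for illustration, not measured from any real layout.

```python
# Hypothetical detection threshold over the shunt resistor, chosen so
# that an idle (empty) block reads as free.
OCCUPIED_THRESHOLD_V = 0.05

def block_occupied(shunt_voltage):
    """A locomotive on the block draws current, so a voltage appears
    across the series shunt resistor."""
    return abs(shunt_voltage) > OCCUPIED_THRESHOLD_V

def ramp_step(current, target, step=0.05):
    """Move a block's PWM duty cycle (0.0..1.0) one step towards the
    target; called periodically, this gives ramp-up and ramp-down."""
    if abs(target - current) <= step:
        return target
    return current + step if target > current else current - step

# Ramping a block from standstill to 30 % duty in 5 % steps:
duty = 0.0
while duty != 0.3:
    duty = ramp_step(duty, 0.3)
```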

WiFi is a convenient way to connect the myRIO box to a computer running scheduler software for the operations of the railway, while the safety functions implemented in the myRIO prevent any accidents from occurring. For example, if a train has reserved a certain block or route, the RIO refuses to turn a switch on that reserved route, even if requested by the scheduler program. Or if a train is about to drive towards a red signal, the safety function can prevent that by switching power off from that section of track.
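The interlocking rule can be sketched as a small piece of Python; the class and method names are my own invention, not any real RIO API.

```python
class Interlocking:
    """Safety layer: tracks route reservations and refuses switch
    commands that would violate them."""

    def __init__(self):
        self._reserved = {}   # switch id -> reserving train id

    def reserve_route(self, train, switches):
        # A train reserves every switch along its route.
        for sw in switches:
            self._reserved[sw] = train

    def release_route(self, train):
        # Drop all reservations held by this train.
        self._reserved = {sw: t for sw, t in self._reserved.items()
                          if t != train}

    def switch_move_allowed(self, sw):
        """The scheduler may move a switch only if no train has it
        reserved as part of a route."""
        return sw not in self._reserved

il = Interlocking()
il.reserve_route("train-1", ["sw3", "sw4"])
allowed = il.switch_move_allowed("sw3")   # refused while reserved
```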

LabVIEW for automation control

Using LabView to control a model railway is not unique; I found a scientific paper describing a similar setup. In that arrangement, a PCI I/O card was used, and all the control logic ran on the control PC with LabView. In my scenario, the control is divided so that all the safety functions are implemented in the RIO, and the PC provides the user interface and the train scheduler.

The open source software Rocrail is a good candidate for the higher-level control and UI, as it provides well-documented, open interfaces for control units. Of course, it would be possible to implement the UI with LabView as well, but why reinvent the wheel?

Why use LabView at all, then? First of all, all the existing model train control software is based on the capabilities of commonly available COTS control systems. What I'm looking for here is something more, and no readily available solutions exist that fully meet the requirements. Secondly, I believe LabView provides good productivity for all kinds of automation and control designs, model railways included.

Edit Jan. 19th:
Frankly speaking, the myRIO is total overkill for the application I described here. The RIO architecture is good for fast signals; in a model railway, the signals are rather slow in that sense.

I just found an Arduino extension to LabView. That looks like an ideal solution. First of all, Arduinos are inexpensive. One can connect several Arduinos simultaneously and gain scalability in the system design. An Arduino with LabView connectivity is fast enough for model railway control purposes.

An Arduino has 6 ADC channels, 6 PWM outputs, and 13 digital I/O lines in total. A simple railway layout can be constructed with only that, and more Arduinos can be connected if necessary.
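As a small sanity check on those ADC channels: assuming the classic Uno-style 10-bit ADC with its default 5 V reference, a raw reading converts to volts as below; the helper function name is my own.

```python
# Convert a raw Arduino ADC reading to volts, assuming a 10-bit ADC
# (raw counts 0..1023) and the default 5.0 V analog reference of an
# Uno-class board.
def adc_to_volts(raw, vref=5.0, bits=10):
    full_scale = (1 << bits) - 1   # 1023 for a 10-bit converter
    return raw * vref / full_scale

# A shunt reading of 10 counts corresponds to roughly 49 mV, which is
# plenty for detecting a locomotive drawing current through the shunt.
millivolts = adc_to_volts(10) * 1000
```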