First published in the Proceedings of TRIZCON2001, The Altshuller Institute, March 2001.
Graham Rawlinson, Principal Consultant, Next Step Associates
Graham@dagr.demon.co.uk - www.dagr.demon.co.uk
Phone +44 (0) 1252 330121
NSA, 12 St Peters Park, Aldershot, Hampshire, GU11 3AU, UK.
Although the five TRIZ core concepts of Functionality, Resources, Ideality, Contradictions and Trends should be applicable to software design, in practice many software issues really reduce to: "we know how to do it, we simply would like to do it (create the software) faster" or, "can we find a smart algorithm that would simplify our coding and operations?"
At the moment there seems to be no sure way of routing back from complex/chaotic data to simplified algorithmic code.
So we have been using TRIZ to assist in the clarification of these kinds of problems.
So the conclusion is that TRIZ is useful, but the solutions derived are not often mind-blowing.
I think the main reason for this is that the design constraint has usually been 100% reliability, and then the goal is simply an increase in speed.
If, when the speed goes up, the reliability goes down, then the usual answer is to wait for a faster processor. It will not take long to arrive!
More recently, however, an article entitled "Burn Out" by Adrian Cho (New Scientist, 3 March 2001) suggests that more complex interactions of goals may emerge. This gives TRIZ a good opportunity to show more of the power of its tools.
His comparison of the computer to a racing car seems apt: not bothered about wasting power, "it is built for pure speed."
"Power hasn't been a concern in the design of processors, but people are now finding it's the primary constraint".
So, how do we look at these issues with TRIZ?
Resources - we simply list the components in the system, as hard or soft components. That is, we can make a model of the hardware, the software functionality structure or both.
Now it gets interesting. With the emergence of power consumption competing with speed we want to broaden our range of parameters (or attributes if you wish) of these components. The interesting parameters are now:
Size - how big the data packets are.
Shape - we can code, in a virtual world, any shapes we like, from a line to a flat plane, to a curved plane, to a sphere or rhomboid.
Position - by mapping onto complex spaces of 1, 2, 3 or more dimensions we can, in hardware and/or in software, encode information in relation to position.
Time - when things are where starts to get important. We can codify data which "should" be moved when it is convenient, or now, or when some criteria reach a certain critical state.
Connectivity - data processing is interconnected, so it is important to handle the interconnectivity of the data efficiently.
When and where things should be done depends on when and where other things are done.
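The parameter list above can be sketched as a simple resource model. This is only an illustrative sketch, not an established TRIZ tool: the names `Component` and `should_move`, and the queue-load criterion, are hypothetical choices made for the example.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a TRIZ-style resource model for a software system.
# Each component carries the parameters discussed above: size, position,
# connectivity - plus a time criterion deciding *when* data should move.

@dataclass
class Component:
    name: str
    size_bytes: int                  # Size: how big the data packet is
    position: tuple                  # Position: coordinates in an n-dimensional map
    connected_to: list = field(default_factory=list)  # Connectivity

def should_move(component: Component, queue_load: float,
                threshold: float = 0.8) -> bool:
    """Time parameter: move data only when a criterion (here, queue load)
    reaches a critical state, rather than immediately."""
    return queue_load >= threshold

packet = Component("data_packet", size_bytes=512,
                   position=(0, 1), connected_to=["cache"])
print(should_move(packet, queue_load=0.9))  # load above the critical state
print(should_move(packet, queue_load=0.3))  # below threshold, so defer the move
```

Even this minimal model makes the interrelations visible: position and connectivity constrain where data can go, while the time criterion decides when it goes.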
Now we have the interesting situation where the energy use versus speed contradiction leads us to examine many of the complex interactions between the parameters above. How will positional, shape, time and size parameters interrelate? Moving on to the concept of Ideality, how do we use the natural interrelatedness of some of these parameters to trim operations from the system?
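The speed-versus-energy contradiction can be made concrete with a toy model, assuming the classic CMOS relation that dynamic power scales with C·V²·f and that supply voltage is scaled roughly in proportion to frequency (so power grows roughly as f³). The constants here are illustrative only, not figures from the article.

```python
# Toy model of the speed/energy contradiction, assuming dynamic power ~ f^3
# when voltage is scaled with frequency. All constants are illustrative.

def energy_per_task(freq_ghz: float, work_cycles: float = 1e9,
                    k: float = 1.0) -> float:
    power = k * freq_ghz ** 3                  # power rises ~f^3
    runtime = work_cycles / (freq_ghz * 1e9)   # faster clock, shorter runtime
    return power * runtime                     # energy per task rises ~f^2

# Doubling the clock halves runtime but quadruples the energy per task:
print(energy_per_task(2.0) / energy_per_task(1.0))  # -> 4.0
```

Under these assumptions the contradiction is quadratic: each doubling of speed costs roughly four times the energy per task, which is exactly the kind of coupled-parameter tension the TRIZ tools are built to attack.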
Then we can examine the contradictions of speed, reliability, energy used, complexity of device etc.
Many of the 40 Principles do not directly map onto our software system, as this system has virtual components as well as physical ones. So if the Contradictions Matrix suggests phase transition, we may need to do some lateral thinking about what this might mean in a virtual world. The key value in phase transition is the way you change how much the elements of the system are locked together. So for software code, one might suggest that the parts are made more separable and flow more readily. In the New Scientist article it is interesting that one feature suggested for change is the local temperature of parts of the system, some parts being more dynamically ready for change than others. Combine this with the Principles of Self-Service, Asymmetry and Local Quality, and you have a playground where you can play with the virtual and the real worlds and see how they might be made to interrelate in a different way.
Probably the most inviting area will then be trends. How multidimensional is our design? How uniform are our operations in each of the dimensions and for how many of the parameters? This will then help us create designs which are fundamentally novel.
I think the power versus speed issue, which has been around for a long time but has not been critical, offers a chance for us all to explore use of the TRIZ tools for software contradictions in a more challenging way.
I do hope others are willing to share in the stories of developments they have worked with.
About the author: Graham Rawlinson is the principal consultant with Next Step Associates, co-author of "How to Invent Almost Anything" - an easy introduction to the art and science of innovation - and a contributor to "Creative Education: Educating a Nation of Innovators".