Monday, March 17, 2014


About a year ago I saw a presentation from a group of telco business analysts. One slide caught my eye: it contained the classic statement “this seems to be a complex problem, we need to design a complex model for it”. Right there and then the systems-engineering part of me nearly had a heart attack. Recently I have seen LinkedIn blogs on “We will solve your complex problems”, or “This is amazing software to solve your complex data problems”, etc. So I visited the companies, downloaded the software and read all the books, articles and comments. Looking back from a practical perspective, as a practitioner trying to optimize organizational assets using data science as a core toolbox, I have the following questions and comments:

1. Why would you spend at least 10 000 euros per year on software that draws nice pictures, when the same results can be produced with something like PostGIS/PostgreSQL/MySQL and R?
2. Do we define solving complex problems as having the ability to process an MxN matrix with thousands of variables and find correlations in it?
3. How can a piece of software help you to solve “complex” if you don’t understand what “complex” really means?
4. Is the saying true “a fool with a tool is still a fool?”
5. Shouldn’t software provide you with the ability to understand and solve “complex” problems in an open environment which is fed by global intelligence rather than single company intelligence?
6. Shouldn’t you start with the basic skills and understanding of data before trusting software solutions?
7. How will you know the software produces correct answers?
8. Can you interpret the results from these “magic tools”?
9. If you can do 7 and 8, why do you need software that “does everything for you”?
10. If the vendor had a magic bullet, why would they sell software rather than change the world with their own competitive weapon?
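
Question 2 deserves a quick illustration. A minimal sketch (using NumPy, with hypothetical sizes of 50 observations and 200 variables) shows why brute-force correlation hunting on a large MxN matrix is not the same as solving a complex problem: even pure noise yields dozens of “significant-looking” correlations.

```python
import numpy as np

rng = np.random.default_rng(42)
# Purely random data: 50 observations of 200 unrelated variables.
X = rng.standard_normal((50, 200))

# Full correlation matrix across all variable pairs.
corr = np.corrcoef(X, rowvar=False)

# Count off-diagonal pairs with |r| > 0.4 -- "findings" that are
# pure noise, since the data contain no real structure at all.
mask = ~np.eye(200, dtype=bool)
spurious = int((np.abs(corr[mask]) > 0.4).sum() // 2)
print(f"{spurious} variable pairs with |r| > 0.4 in pure noise")
```

With thousands of variables the number of spurious hits only grows, which is why a tool that “finds correlations for you” answers questions 7 and 8 for nobody.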

So when we deal with “complex problems”, “complex solutions” and “complexity”, I think one should master the following principles, as practised in complexity management:
11. Have a standard work approach for data science so that you can understand how to select, prepare, analyse, model and visualise data.
12. Think in terms of multidimensional data-sets.
13. Expand your multidimensional thinking to include spatial data.
14. Expand spatial data to include geographical data, images and dynamic object movements.
15. Think of spatial data in terms of how it changes over time – temporal insight.
16. Multidisciplinary insight rather than functional insight solves complex problems.
17. A large MxN matrix with thousands of variables does not capture or solve “complex” – rather, it reflects a tool jockey who lacks insight into the problem that needs to be solved.
18. Kill MS Excel ideas in the data science domain.
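
Points 12 to 15 can be sketched in a few lines. This is a toy example with hypothetical transaction records (date, region, amount are invented names), showing what it means to slice one data set along a spatial dimension and a temporal dimension at the same time rather than as a flat Excel sheet:

```python
from collections import defaultdict
from datetime import date

# Hypothetical transaction records: (date, region, amount).
transactions = [
    (date(2014, 1, 5), "Gauteng", 120.0),
    (date(2014, 1, 19), "Gauteng", 80.0),
    (date(2014, 2, 2), "Western Cape", 200.0),
    (date(2014, 2, 11), "Gauteng", 50.0),
]

# Aggregate along two dimensions at once:
# region (spatial) and month (temporal).
totals = defaultdict(float)
for d, region, amount in transactions:
    totals[(region, d.strftime("%Y-%m"))] += amount

for key in sorted(totals):
    print(key, totals[key])
```

In practice the spatial key would be a PostGIS geometry rather than a region name, but the thinking pattern – every fact indexed by where and when – is the same.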

This means that if we borrow from bioinformatics and gene-mapping analytics, we can use that knowledge to build data solutions that assist in identifying fraud patterns inside a large ERP system. Some of my favorite “business fractal images” show how these multidimensional data sets can be used to find optimal trade densities from ERP and geospatial data – definitely not possible through a proprietary system, but possible through multiple open-source collaboration efforts.
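
As a hedged sketch of the fraud-pattern idea (the vendor names, invoice numbers and the 10 000 approval threshold below are all invented for illustration), one classic ERP test is flagging repeated identical payments to the same vendor just under an approval limit:

```python
from collections import Counter

# Hypothetical ERP payment lines: (vendor, invoice_no, amount).
payments = [
    ("Acme Ltd", "INV-001", 9_990.00),
    ("Acme Ltd", "INV-002", 9_990.00),
    ("Acme Ltd", "INV-003", 9_990.00),
    ("Beta BV", "INV-104", 1_250.50),
]

# Pattern: three or more identical amounts to one vendor, each
# sitting just under an assumed 10 000 approval threshold.
counts = Counter((vendor, amount) for vendor, _, amount in payments)
flags = [k for k, n in counts.items() if n >= 3 and k[1] > 9_000]
print(flags)
```

A real implementation would run such pattern queries directly in the database over millions of rows, and join the hits against spatial and temporal dimensions as argued above – none of which requires a closed “magic tool”.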
