Algorithms & Machine Learning Demystified

When I was learning how to troubleshoot and repair electronics in the Navy, I would sometimes challenge one of the instructors on how something worked. If I delved into a complicated subject, I was often told it worked on “FM,” which meant f***ing magic. That rarely stopped me, however, and I often found the concepts were not overly complicated, just not directly relevant to my training.

There is some FM in information security that I’d like to demystify as we look at how tools can enable rather than hinder the craft. In this installment we’ll examine algorithms and machine learning.

Algorithm = programming function

The definition of an algorithm is “a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.” To a computer programmer, this is a function or method. These functions can do things as simple as performing arithmetic or as complicated as landing an aircraft on a moving aircraft carrier. When vendors claim they have proprietary algorithms, what the initiated hear is “we wrote some custom code.”
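To make that concrete, here is a minimal sketch in Python. The function name, inputs, and scoring weights are all hypothetical, invented purely for illustration, but a “proprietary algorithm” is often nothing more exotic than this:

    def risk_score(failed_logins, off_hours_activity, new_device):
        # A hypothetical "proprietary algorithm": just a function
        # that combines a few inputs into a number.
        score = failed_logins * 10           # repeated failures raise the score
        score += 25 if off_hours_activity else 0
        score += 15 if new_device else 0
        return min(score, 100)               # cap the score at 100

    print(risk_score(failed_logins=3, off_hours_activity=True, new_device=False))  # 55

Useful, perhaps. Magic, no.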

Since algorithms are just code, they are limited by the usefulness and quality of that code. Complicated code isn’t always better code (it is often the opposite). New approaches aren’t innately better than venerable ones. Vendors and marketing departments are hoping the audience will hear “awesome magic code” and not ask too many pointed questions about why it is “awesome” or “magic.” It’s a demeaning approach that implies customers are unable to comprehend the code. When vendors say the word “algorithm,” I just mentally replace the term with “code,” which normally makes the claim sound ridiculous or pedestrian. If you’re selling software, you are supposed to have code.

Machine Learning

Machine learning is often touted as the salvation of cybersecurity. Simply defined, machine learning is code that can identify patterns. Optical character recognition (OCR) is a fine, time-tested example of machine learning: programs have been created that can discern the difference between a lowercase a and an uppercase P. Fingerprint readers, facial recognition in Facebook photos, and network security anomaly detection are other examples. Computer science enthusiasts get really excited about computers being able to pick out these patterns, and it does seem like a big deal to those of us trying to make computers more useful.
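To show what that pattern matching looks like in practice, here is a short Python sketch. The choice of scikit-learn is my own assumption (real systems like Facebook’s tagging are far more elaborate); it trains a simple classifier on scikit-learn’s bundled handwritten-digit images, a close cousin of the OCR example above:

    # Minimal pattern recognition: "learn" from labeled digit images,
    # then classify new ones by the known images they most resemble.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    digits = load_digits()  # 8x8 grayscale images of the digits 0-9
    X_train, X_test, y_train, y_test = train_test_split(
        digits.data, digits.target, test_size=0.25, random_state=42)

    model = KNeighborsClassifier(n_neighbors=3)
    model.fit(X_train, y_train)              # "training" = storing labeled examples
    print("accuracy:", model.score(X_test, y_test))  # typically around 0.98 here

The point isn’t the library. The point is that “machine learning” here is a dozen lines of code matching new patterns against patterns it has already seen.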

The need for & progress of machine learning reveal the superiority of the human mind over computers. While I am still surprised when Facebook automatically recognizes and tags my face in a photo, my niece has been able to do that (as well as identify other family members, PBS celebrities, songs, letters & characters) since she was 3 years old.

The human brain is vastly superior to computers at pattern recognition. Before a child can speak or do arithmetic, she can identify faces, shapes & patterns.

Machine learning is an interesting area of computer science, and as it evolves past the abilities of a human toddler, it will provide assistance to human operators (like auto-tagging my FB photos). But it will not be comparable to the power of the human brain for the foreseeable future because of the innate differences in how people & machines process information.

FM Summary

In this installment we’ve uncovered that algorithms aren’t magic; they’re just code. We also saw that machine learning is the difficult task of teaching a computer to identify patterns, an ability that is innate in humans. In the next installment we’ll examine how cognition & AI compare and how they can work together.
