Many, If Not Most, Programmers Do Not Really Program Anymore
Raéd Alexander Ayyad
#VeritatemDilexi ... "The most serious mistakes are not being made as a result of wrong answers; the truly dangerous thing is asking the wrong question." —Peter Drucker
Flat-packs...
Most programmers I've come across over the past decade and a half do not really program anymore; they assemble premanufactured blocks. Today's programmers are more like highly paid IKEA flat-pack assembly professionals than professional furniture makers.
That said, I think this is alright, with one exception in their favor: such 'assembly' skills fall squarely within the core craft of "automation!" Yes, there are exceptional programming geeks out there—many probably hired by government cybersecurity squads and other law-enforcement and intelligence agencies... others are entrepreneurs and innovators, but the latter, I think, are the exception nowadays.
Like what I have seen in the field of project management, I have witnessed another not-so-flattering trait become prevalent among "professional" programmers: copying existing frameworks and then making a few changes to the 'facade,' the skin, so as to 'differentiate' their work. That leads to cookie-cutter solutions that are not really fully compatible with their intended uses; worse, by not interrogating the older code being copied, programmers can embed cybersecurity gaps that are best left out.
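To make that risk concrete, here is a minimal sketch of the kind of inherited flaw I mean (a hypothetical example of mine, using Python's standard sqlite3 module, not code lifted from any actual framework): the string-built query copied from the "old" codebase carries a classic injection hole that a facade-level reskin will never surface.

```python
# A hypothetical sketch of copy-paste risk, using only Python's standard library.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

def find_user_copied(name: str):
    # The habit inherited from the copied codebase: SQL built by string formatting.
    # Input such as  anything' OR '1'='1  returns every row: a classic injection hole.
    return conn.execute(f"SELECT id, name FROM users WHERE name = '{name}'").fetchall()

def find_user_interrogated(name: str):
    # What interrogating the inherited code should produce: a parameterized query.
    return conn.execute("SELECT id, name FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_copied("anything' OR '1'='1"))        # leaks both rows
print(find_user_interrogated("anything' OR '1'='1"))  # returns []
```

Both functions "work" on well-behaved input, which is exactly why a review of the skin alone never catches the difference.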
At the beginning... Issues with teaching computer science
In an article published by "Inside Higher Ed," the authors observe that the field's breakneck pace of development presents teaching challenges in Computer Science (CompSci), too. "Every hour in 2019, more than three artificial intelligence preprints were submitted to arXiv—an open-access repository of electronic scientific preprints. That rate was over 148 times faster than in 1994" (about the time I started real programming), according to a Journal of Informetrics study. On the AI subtopic of deep learning alone, more than one preprint was submitted every hour—a 1,064-fold increase over the 1994 rate.
"It's kind of like in medical school when they talk about the 'half-life of knowledge.' The medical school dean tells graduates, 'In five years, half of what we tell you will turn out to be false,'" said Alexei Efros, computer science professor at the University of California, Berkeley. "The half-life of knowledge in computer science is quite short. In machine learning, it is about three months." This alone, in my view, makes continuing education in computer programming a paramount quest for anyone who intends to make a career in software/application development (AppDev)!
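As a back-of-the-envelope illustration (my own arithmetic, not a figure from the study): if the three-month half-life holds, simple exponential decay shows how little of today's machine-learning specifics survives a year.

```python
# Exponential decay of "current" knowledge, assuming a three-month half-life.
HALF_LIFE_MONTHS = 3.0

for months in (3, 6, 12, 24):
    remaining = 0.5 ** (months / HALF_LIFE_MONTHS)
    print(f"after {months:2d} months: {remaining:6.1%} of specifics still current")
```

By that arithmetic, barely six percent of the perishable specifics survives a single year; hence the paramount quest.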
I have also observed that most schools do not mandate dedicated critical-thinking, logic, and ethics classes as part of the required CompSci curriculum. Such disciplines are critical for training programmers to foresee consequences and plan for them; moreover, such knowledge, if taught properly, should also create a better consciousness about the ethics of what we create and how we use it. Ideally, this teaching should begin by middle school and continue advancing through high school and beyond.
The subject of "ethics" has been brought front-and-center, today, especially that the use of artificial intelligence (AI) is becoming much more prevalent. For the sake of maintaining a degree of focus on the subject of this article, we shall not go into an in-depth discussion on education, the critical use of logic, and ethics, and I will aim to author another article where we can delve into the matter of education vis-à-vis AppDev at a later time.
"Since the first year of studying programming at university I have known in my heart that computer programming is not meant for me, but I was afraid to do anything about it and here I am now 12 years later programming with no passion. I am a career programmer and an average one at best." -So You Don't Want to be a Programmer After All
Lessons learned from aviation
In past articles, I have used lessons I've learned from the aviation field, especially piloting, as a reflection point, a contrast, to critique other fields that require technical aptitude, creativity, and critical communication skills... so, I intend to do the same here:
The applicable lesson here is a matter of recurring hands-on training, whether in the simulator or where a pilot takes over control of an aircraft from the automation (autopilot) during phases that demand the application of all piloting skills, such as preparing for, and managing, the landing.
By doing so, a pilot keeps their technical skills sharp instead of relying heavily on automation. Automation today can actually fly a plane from take-off to landing, and even beyond once regulations allow, just as "wizards" and artificial intelligence can when it comes to computer coding.
If pilots do not have the opportunity to practice this, relying heavily on the automation instead, and then the moment arrives when they must hand-fly an aircraft because, say, the electrics fail... I trust you can see where I'm going with this argument... it'll be bad news!
Another way of looking at this is through the lens of something we do every day: making phone calls. Until a few years after college, when mobile phones became relatively prevalent, I had memorized the number of every phone our family had, those of all my friends, and those of most of my immediate family... that's not the case anymore! Why? I have become accustomed to saving the data in my phone's directory ("address book") and just speed-dialing, or re-dialing, numbers when I want to initiate a call!
The latter reminds me of a relatively humorous anecdote I read on one of the social media platforms, which stated: "I hope that I do not get arrested, where they would take away my mobile phone, because when the time comes to make that one call from jail, I will not remember the number of anyone who can help me, not even my parents!"
Optimization
Driven by constraints...
The same applies to coding. When I started my venture into programming in the late 1980s, as a child, there were a few constraints that forced us to produce 'quality optimized code,' in contrast with today: memory was measured in kilobytes, processors were slow, storage was scarce and expensive, and there was no internet full of ready-made libraries and answers to lean on.
Moreover, later, in school and college, our code was expected to be very efficient and very legible (such that any person could pick it up and understand what we are, or were, doing).
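As a trivial illustration of the difference those constraints trained into us (my own sketch, not from any curriculum): two behaviorally identical ways to deduplicate a list while preserving order, one wasteful, one efficient, and both legible.

```python
# Two behaviorally identical ways to deduplicate a sequence, preserving order.

def dedupe_wasteful(items):
    # The "resources are abundant" habit: membership testing against a list
    # is a linear scan, so the whole function is O(n^2).
    result = []
    for item in items:
        if item not in result:
            result.append(item)
    return result

def dedupe_optimized(items):
    # What scarcity taught us: track seen values in a set for O(1) lookups,
    # making the whole pass O(n), with no loss of legibility.
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

print(dedupe_wasteful([3, 1, 3, 2, 1]))   # [3, 1, 2]
print(dedupe_optimized([3, 1, 3, 2, 1]))  # [3, 1, 2]
```

Both print the same answer; only one of them still prints it promptly when the list holds millions of entries.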
The common abundance of all these resources since the turn of the century, at very reasonable cost, has led to the creation of a class of very lazy and untidy programmers who lack creativity... I think that is disappointing, if not simply deplorable.
I truly believe that one of the major gaps—when it comes to cybersecurity risks—is mediocre, sloppy programming, and a heavy reliance on impressing stakeholders and end users with flashy graphics and GUIs to win approval of the final product, rather than on the provable resilience of the code.
When it comes to programming, most AppDev team managers place a heavy emphasis on appeasing stakeholders with how things "look"... a lot of the rest is cookie-cutter. The latter is most notable when working with many offshore developers. The problem with that approach is that most stakeholders are tech- and programming-illiterate, and are not qualified to critique the efficiency and effectiveness of code; so, they don't.
During quality checks, most of the work delegated to teams of programmers, whether on-shore or off-shore, focuses on confirming that "the code works," not on "how it works." There are many ways to communicate a message, and the same applies to coding—which is simply another living language, just like English, French, Arabic, Spanish, or Mandarin.
Computer programming is the same; on the "outside" it may seem that the output of the code is meeting the stakeholders' expectations, but in the "bowels of the program" there can be all sorts of convoluted ways to generate that output, and some of those ways create loopholes that can be exploited (like when someone "twists words you've said" in conversation) and become cybersecurity threats.
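Here is a minimal sketch of that "same output, different bowels" problem (my own hypothetical example, not drawn from any particular product): two token checks that return identical results on any single functional test, yet one of them leaks information through its timing.

```python
# Two token checks whose observable output is identical on any single call.
import hmac

STORED_TOKEN = "s3cr3t-api-token"  # hypothetical stored secret, for illustration

def check_token_naive(supplied: str) -> bool:
    # '==' bails out at the first mismatched character, so response timing
    # leaks how much of the token an attacker has already guessed correctly.
    return supplied == STORED_TOKEN

def check_token_hardened(supplied: str) -> bool:
    # Constant-time comparison: same True/False output, no timing side channel.
    return hmac.compare_digest(supplied.encode(), STORED_TOKEN.encode())

# A "the code works" quality check waves both implementations through:
assert check_token_naive("wrong") is False
assert check_token_hardened("wrong") is False
assert check_token_naive(STORED_TOKEN) is True
assert check_token_hardened(STORED_TOKEN) is True
```

A QA pass that only asserts the outputs will never see the difference; only someone reading the bowels will.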
The breakneck speed at which developers are expected to create code nowadays, to meet outrageous deadlines enforced by mediocre yes-men project managers who should know better, is a notable contributor to the problem.
In proper scientific research, there are always entities that must perform peer reviews to validate the integrity of the work, because it is often critical. That method is not really applied to computer programming, and it surely should be, especially as reliance on AI becomes more of a norm!
On that note, I want to emphasize my extreme disappointment in the fact that those who perform peer reviews in the scientific fields are not better recognized, nor rewarded, for the most critical, and fantastic, work they do. We must remediate such a glaring gap, so as to encourage making peer review part of the standard operating procedures (SOPs) in research and development (R&D).
Furthermore, I think that when peer-reviewers identify gaps or errors in the material they are reviewing, they should be further rewarded with notable bonuses... and if they accomplish this under tight constraints of time, budget, or scope, they should be rewarded doubly!
Vomiting code...
This "modern" methodology of vomiting out code needs to come to an end. Another thing that must come to an end is the habit of making creating hardware solutions as a primary concern, while leaving the software aspects as afterthoughts. The latter is exactly what leads to companies cobbling-up such nightmarish software packages that are in constant costly need for patching and updating, while wasting, and even risking, users' assets.
It should also be pointed out that the ignorant consumer (the primary stakeholder) plays a big role in this outcome of mediocre software, and I must say that this is something many software developers take into consideration: "what they don't know doesn't hurt them!"
Most consumers are oblivious to the fact that every time a software package or "app" is updated, it usually demands more resources on the customer's platform (real estate)... resources such as hard-drive space, ROM, RAM, and CPU power. This issue is what often renders a customer's technology obsolete, forcing them to purchase new technology, or upgrade their existing tech, at notable cost. You could say that this is a form of engineered obsolescence.
True, the latter problem can be remediated by using [outsourced] "Cloud Computing"*, but to gain this advantage, the customer has to surrender autonomy and various degrees of control over their infrastructure. Major corporations with ample financial assets may not worry much about that "sacrifice," because in case of a failure they'd gladly pass the buck and blame the cloud service provider for any resulting calamity... actually, this marketing point has been strategically used vis-à-vis the "Big-4" consulting groups as a 'perk,' a form of insurance. From a smaller-scope perspective, I can demonstrate that this can be a trap that's hard to get out of... some cases in point follow...
[* Back in the 1960s, this was called remote job entry and timesharing... yeah, "cloud" isn't new.]
Never trust a business model that sticks it to the "small guy;" they'll get to abusing the "big guy" soon enough—if they don't sell, or go out of business, first!
“If you're horrible to me, I'm going to write a song about it, and you won't like it. That's how I operate.” ― Taylor Swift
One of those perfect storms...
I had the distinction of developing the first family website in the history of the internet and internet hosting: it included an encyclopedia of researched historical facts about the family lineage and the different regions of the world, images, a chat section (before MySpace and Facebook), the test-bed for an integrated free VoIP voice service (during the dial-up era), and an e-commerce page selling promotional family memorabilia to finance the site; it was called AyyadCentral.net. People as far away as Scandinavia were interacting with it!
During an extended business assignment overseas, the hosting company had a problem processing the monthly dues, so what do you think they did? Nope, they didn't contact me; they took down the website, cancelled my account, and deleted all my code and files... a decade's worth!
When I discovered that, after my return home to Texas, I contacted their customer service department thinking that the data was backed up in some disaster recovery system... nope; it was all lost. What compounded the problem is that the upgraded Seagate hard-drive in the Toshiba laptop I used to develop and save the code failed during the return trip, and wasn't recoverable!
So, perhaps that example is one of freak accident and unexpected misfortune, but the fact is that I had relied so heavily on the service providers to protect said assets that I didn't think I needed to back things up*... it was this incident that led me to pursue, and then achieve, expertise in disaster recovery and business resumption (DR/BR).
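Even a trivial scheduled copy of my own (a minimal sketch below, with hypothetical paths, and no substitute for real DR/BR) would have turned that loss into an inconvenience:

```python
# A minimal backup sketch: archive the working tree to a second disk with a
# timestamped name, so no single provider or drive holds the only copy.
import shutil
import time
from pathlib import Path

SOURCE = Path.home() / "websites" / "ayyadcentral"  # hypothetical project directory
DEST = Path("/mnt/backup_drive")                    # hypothetical second disk or NAS

stamp = time.strftime("%Y%m%d-%H%M%S")
archive = shutil.make_archive(str(DEST / f"site-{stamp}"), "zip", root_dir=SOURCE)
print(f"backed up to {archive}")
```

Run something like it on a schedule, rotate a copy off-site, and the classic 3-2-1 rule (three copies, two media, one off-site) is within reach even for a hobby project.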
Not YourTube, but surely YourProblem...
If you are not the paying customer, you are the product being sold!
I used to upload, weekly, the business podcasts and video ads I produced to the YouTube channel Teknikr?n... One day, as I was about to upload a new production, I discovered that the collection that was supposed to be on the server was no longer there... again, there was no explanation, no communication, nothing. Since it was a "free" account, I couldn't get any support to verify the reason.
The missing link at LinkedIn...
Ah, LinkedIn! The so-called "professional" platform, my love-hate relationship, and the authoring venue where I've published and republished many an article over the years.
Particularly since Microsoft took over the company, they keep removing features and making illogical, damaging changes (from the user's perspective)... changes such as deleting embedded content in published articles and altering text formatting and layouts, without any notification to, or permission from, the user!
You see? We, our professional image, and even our digital well-being are at the mercy of businesses and the bosses of the code developers—to say nothing of their incompetent developers and support teams!
Wyze? Certainly, neither wise, nor intelligent... Crooks!
I give you Wyze! A company whose mediocre coding and cybersecurity configuration has been "utilized" by malicious actors more often than the sex workers of Amsterdam's Red Light District!
Wyze decided to integrate Two-Factor Authentication (2FA) into the login process that allows users to access and control their Wyze "smart-home" hardware and 'security' infrastructure. No communications were sent to the users' email addresses, the same addresses they send marketing material to all the time! Here comes my beloved Murphy!...
... One evening, I discovered that the link between my "smart home" Wyze and Google accounts was not working; that had happened before, and the remedy was to open each app and basically do a "refresh." Not this time... this time, I was not able to log onto my Wyze account, so I started troubleshooting by restarting my smartphone, which led to no change. I tried to log on from my desktop, which also failed. So, I visited the Wyze website to use their knowledge base, which recommended that I uninstall the app from my smartphone and then reinstall it. I followed the instructions, but that made things worse, and I was locked out of access entirely!
Frustrated, I waited, thinking that their systems might be down (which has also happened before), but to no avail. I attempted to report the problem via text, to which they kept repeating the cookie-cutter solution from the knowledge base.
Eventually I was able to reach support, who weren't able to solve the problem and, supposedly, escalated the matter to second-tier support, who totally blew me off (a paying customer).
Their final solution was for me to sacrifice all the archived security footage (which included evidence of a break-in I had reported to law enforcement), and to reset and reconfigure ALL my Wyze "smart" devices (cameras, switches, plugs, watches, scales, lights, etc.) from scratch, at the expense of my own time and resources.
It is still nearly impossible to be connected with anyone past first-level support, who just read solutions from the same online knowledge base that customers have access to. There is still no professional follow-up, or escalation venue, either. Sadly, Wyze has become the dictionary demonstration of what the phrase "shit show" means.
... and so on, my friends! I can serenade and amuse you with a large compilation of such examples that I, and people I've met, have experienced... I'm sure that you have a few of your own such stories, too!
A web of deceit and exploitation! ... Web.com
Most recently, Web.com bought iPage... no notifications were shared with iPage customers. One of the first things the decision-makers at Web.com did to iPage was destroy two of the best differentiators the company had.
What recourse did the longtime, loyal iPage customer have, once aware of the changes and the challenges they brought? None, but to either accept fewer services at higher cost, or start searching for a new service provider and then go through the rigmarole of migrating their entire online presence.
Why are we allowing a business such unfettered power? To not notify millions of customers of impending critical changes to the organization that will impact them directly and notably, financially and in terms of service expectations? This is another one of those gaps triggered by the incompetence, if not corruption, of the politicians who govern and regulate consumer-protection practices.
Placing the burden of detection and the burden of penalty on the customer
As in the title of my book, The "Good-Enough Society," what tends to drive the creation and production of any product nowadays is mediocrity. What do I mean by that? Well, since most customers are not tech-savvy, it is very easy to "pull the wool over their eyes"; hence, like dogs and cats, it's easy to distract them with flashy, colorful objects from the inherent incompetence in design and application... jury-rigging, as with Microsoft Windows, Capital One, Paramount+, and Boeing (the MCAS system software)!
Moreover, the fact that most code is proprietary and not shared with the public makes it nearly impossible to catch "bugs" within it before they become risks and tangible problems. That issue is a major selling point for open-source software—among its many advantages.
Customers today have become accustomed to a "plug & play" model of operation and compatibility. They just want to flip the switch and get the software running in a form close enough to their expectations (the expectations marketed to them)... how that is accomplished is of no interest to them at all! Just as with corruption in the business world, it takes about a decade before a critical gap in a program's code is exposed and exploited. Of course, just as with financial corruption, we know that the laws, and law enforcement, trail such developments by about another decade... the latter, at least in the US, is also driven by the utmost ignorance of our lawmakers and politicians when it comes to IT!
So, what should be done?
I think that by improving the process all the way through, from inception-of-thought to utilization by the customer, all those concerned will benefit considerably. Improving the following critical aspects of application, or system, development will render better outcomes, including more resilient cybersecurity:
"It's the cost of doing business" no more!
Understanding the nature of the beast: since the trigger that leads businesses to create shoddy products is, more often than not, the desire to save on production costs and maximize profits, the financial penalties must be made notably higher than any "savings" they may have harvested.
Today, the use of software has come a long way from the days when it was mostly the domain of large corporations, government, and the military; days when any damage could be contained relatively quickly—and quietly. Today, information technology permeates every aspect of our daily existence, especially in the industrial nations. The Internet of Things (IoT) integrates it even further into our lives and livelihoods. IT is no longer a luxury that only a few rely on; hence, just as with potable water, we must ensure its treatment and protection with the utmost seriousness.
I don't care, using this technology doesn't impact me...
For those who say, "I don't care, I don't use this garbage; I'm old-fashioned!" or "I love living off the grid"... we can revisit your thoughts the day your identity is stolen and your bank account is hacked; your kind is the primary target for such efforts, which cost Americans alone $43 billion in losses in 2023!
More seriously, if someday your life-saving medical technology, which relies on IT to function, is hacked (500 hacks, affecting 200,000 people, in 2023 alone), you will be singing another tune.
Without the information technology that you disparage, many of the incredible solutions we have, and use, today would not be possible, especially in the medical fields. So, do not be too quick to dismiss the tremendous value of IT, not only in improving the quality of our lives, but in saving them, too.
All this is a headache; let's just go back to the "simpler times"... not!
The solution is not to stop using the technology; the solution is to extract human avarice and cupidity, and the lack of a moral compass, from the business equation. I know that may sound like a tall order, but we already have thousands of years of failure trends driven by these qualities, which demonstrate the value of investing in extracting them from the culture of any business entity in order to maintain an attainable, continuing, healthy growth curve while eliminating the need to transition through costly phases of struggle.
What do y'all think? Feel free to share your critical, constructive thoughts on the subject—if you have any.
OTHER ARTICLES BY AUTHOR: https://www.dhirubhai.net/in/raedmalexanderayyad/recent-activity/articles/