Python...meh
I recently undertook a project to convert as many Excel functions to Power BI's DAX as possible. Perhaps oddly, I actually learned Python along the way. Now, I've written in all types of programming and scripting languages, probably 20 or 30 at the very least, but I had never written a lick of Python or even installed it. But why did I learn Python during a project to convert Excel functions to DAX? Well, it twas...Lifebuoy. OK, actually it was Excel's GAMMA function. The math for Gamma is not fun. And Excel's "documentation" was certainly no help. Apparently, Gamma is notoriously hard to integrate; particularly so if you've been blinded by soap poisoning or bombarded with its rays...
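For the curious, the function in question is the Euler integral

Γ(z) = ∫₀^∞ t^(z−1) e^(−t) dt

and that integral has no closed form for general z, which is exactly why you end up reaching for numerical approximations like the one below.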
So, what is a classically trained engineer to do when the math gets hard? You fudge it and resort to numerical methods, of course. So that’s what I did. And I got to within 12 decimal places of accuracy. So, good job me. I got there because I happened across a Python implementation of the Lanczos approximation of Gamma. So I had to install Python and learn Python in order to translate the code to DAX.
Translating the Python code to DAX was actually pretty cool. It was tricky because there was a recursive element in the code as well as a for loop! Both of those are normally non-starters for DAX, but where there is a will there is a way (I'll unpack the trick right after the DAX code below)!
The Python code for reference is:
from cmath import sin, sqrt, pi, exp

p = [
    676.5203681218851,
    -1259.1392167224028,
    771.32342877765313,
    -176.61502916214059,
    12.507343278686905,
    -0.13857109526572012,
    9.9843695780195716e-6,
    1.5056327351493116e-7
]
EPSILON = 1e-07

def drop_imag(z):
    if abs(z.imag) <= EPSILON:
        z = z.real
    return z

def gamma(z):
    z = complex(z)
    if z.real < 0.5:
        y = pi / (sin(pi * z) * gamma(1 - z))  # Reflection formula
    else:
        z -= 1
        x = 0.99999999999980993
        for (i, pval) in enumerate(p):
            x += pval / (z + i + 1)
        t = z + len(p) - 0.5
        y = sqrt(2 * pi) * t**(z + 0.5) * exp(-t) * x
    return drop_imag(y)

print(gamma(1))
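And just so you know the "12 decimal places" isn't a fish story, here's a quick sanity check (my addition, not part of the original snippet) that pits the gamma() function above against Python's built-in math.gamma:

import math

# Sanity check: compare the Lanczos approximation above against the
# standard library's math.gamma at a few sample points.
for z in [0.5, 1.0, 2.5, 5.0, 7.2]:
    approx = gamma(z)          # real-valued after drop_imag()
    exact = math.gamma(z)
    print("gamma(%.1f) = %.15f (off by %.2e)" % (z, approx, abs(approx - exact)))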
The equivalent DAX code is:
GAMMA =
VAR __zInput = MAX('Table'[z])
VAR __p = {
    (0, 676.5203681218851),
    (1, -1259.1392167224028),
    (2, 771.32342877765313),
    (3, -176.61502916214059),
    (4, 12.507343278686905),
    (5, -0.13857109526572012),
    (6, 9.9843695780195716e-6),
    (7, 1.5056327351493116e-7)
}
VAR __EPSILON = 1e-7
VAR __z = IF(__zInput < 0.5, 1 - __zInput - 1, __zInput - 1)
VAR __pTable = ADDCOLUMNS(__p, "x", [Value2] / (__z + [Value1] + 1))
VAR __x = 0.99999999999980993 + SUMX(__pTable, [x])
VAR __t = __z + COUNTROWS(__pTable) - 0.5
VAR __y =
    IF(
        __zInput < 0.5,
        PI() / (SIN(PI() * __zInput) * SQRT(2 * PI()) * POWER(__t, __z + 0.5) * EXP(-1 * __t) * __x),
        SQRT(2 * PI()) * POWER(__t, __z + 0.5) * EXP(-1 * __t) * __x
    )
RETURN __y
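Here's how the two DAX non-starters got eliminated. The for loop becomes a SUMX over the coefficient table, and the recursion goes away because the recursive call in the reflection branch only ever goes one level deep: whenever z < 0.5, the call is made with 1 − z ≥ 0.5, which always takes the non-recursive branch. So the whole thing unrolls into a single IF. Here is that same unrolled logic sketched back in Python (my reconstruction for illustration, not code from the original source):

import math

# Lanczos coefficients, same values as the __p table in the DAX measure.
P = [676.5203681218851, -1259.1392167224028, 771.32342877765313,
     -176.61502916214059, 12.507343278686905, -0.13857109526572012,
     9.9843695780195716e-6, 1.5056327351493116e-7]

def gamma_unrolled(z_input):
    # Mirrors the DAX measure: no recursion, no explicit loop.
    reflect = z_input < 0.5
    # Reflection case: evaluate the series at 1 - z_input instead.
    # Since 1 - z_input >= 0.5 there, this branch never needs to recurse.
    z = (1 - z_input) - 1 if reflect else z_input - 1
    x = 0.99999999999980993 + sum(pval / (z + i + 1) for i, pval in enumerate(P))
    t = z + len(P) - 0.5
    y = math.sqrt(2 * math.pi) * t ** (z + 0.5) * math.exp(-t) * x
    # Reflection formula: gamma(z) = pi / (sin(pi*z) * gamma(1 - z))
    return math.pi / (math.sin(math.pi * z_input) * y) if reflect else y

Same answers as the recursive version for real inputs away from Gamma's poles (the non-positive integers), which is the same domain the DAX measure handles.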
So what do I have to say about Python now that I have added it to my repertoire of coding languages? Well, I have to say that I am...well...I’m...I’m pretty unimpressed actually. Given all of the hype around Python and its supposedly exponential growth in popularity, I honestly and sincerely do not see the attraction. I mean, it’s a programming language and really not much different from most of the other languages I know. You can have variables, and you can loop, and write functions and, well, you can do exactly what you can do in every other general purpose programming language. Yaaaaaaayyyyyy......?
But all the hype. Man, going into this I was told the following were unequivocally and 100% true:
- Python is much more accessible and easier to learn than other programming languages
- Python has huge tracts of libraries...
- Python has big data
- Python is more understandable and readable than other programming languages
- Python is FAR more productive, requiring fewer lines of code
...
>cracks knuckles<
...
OK folks, let’s do these in order:
Python is much more accessible and easier to learn than other programming languages
"I mean, gee willikers Harvey, I have an IQ of 12 and even I can learn Python." No…no you cannot. You see, Python is no easier or harder to learn than any other programming language because syntax is the easy part. The hard part is actually learning the concepts of programming. To learn how to think in terms of variables and loops and recursion. That’s the hard part about learning to program, not which language you pick as your first one. Once you learn to program, most other languages are a breeze to pick up unless there is something fundamentally different about the language, like how lists are so central to Lisp that even the program code is made of lists or how context is so central to DAX. New concepts are hard. New syntax is not. So…no.
Python has huge...tracts of libraries...
"You can’t believe the options that come with this baby, like the TruCoat, you gotta get the TruCoat or you’ll get the oxidation…" Let’s be clear, this makes zero sense and is clearly not a differentiator. Every programming language...wait, that's not strong enough...EVERY programming language on the planet that has been around for any length of time has zillions of lines of code in the form of functions, modules and other pre-written packages. Pick one, Perl, R, javascript, C#, you name it. They all have mountains of code libraries, API's and other things that that you can just pick up and use. So, this is an ignorant comment because it's all 1's and 0's in the end people so it's not like a Python library can do anything more special than a similar library written in another language.
Python has big data
Ooo wee, I gotta see this! A programming language thats gotz the big data! Scintillating. Clearly no. Let’s use one over-hyped meaningless term to sell something else that we want to over hype. You mean Python has code libraries for accessing data that is stored as shards? See “huge tracts of libraries” above.
Python is more understandable and readable than other programming languages
"My 2 ? week old learned how to read and understand Python code before she could even lift her head." Wow, that’s some amazingly readable code. Let’s put this one to the test. Here are standard for loop constructs in a bunch of different languages, including Python:
Python
for x in range(0, 3):
    print("value of x: %d" % (x))

C
for( x = 0; x < 3; x = x + 1 ){
    printf("value of x: %d\n", x);
}

C++
for( x = 0; x < 3; x = x + 1 ) {
    cout << "value of x: " << x << endl;
}

C#
for ( x = 0; x < 3; x++) {
    Console.WriteLine("value of x: " + x);
}

Java
for( x = 0; x < 3; x++){
    System.out.println("value of x: " + x);
}

Pascal
for x := 0 to 2 do
    writeln('value of x: ', x);

R
for ( x in 0:2 ) {
    print(paste("value of x: ", x))
}

Perl
for( $x = 0; $x < 3; $x = $x + 1 ) {
    print "value of x: $x\n";
}

PHP
for ($x = 0; $x < 3; $x++) {
    echo "value of x: $x\n";
}
Wow, yeah, obviously waaaaayyyy more readable, that Python code. You know, because the colon there, that's way more readable and intuitive than curly brackets. I mean, curly brackets? What moron uses curly brackets to encapsulate a block of something...cough...math. And who uses a + to concatenate text when "%d" % (x) is soooooo much more intuitive? Mmmm...hmmm.
Except…hmmm…now that I look at it, out of the nine for loops listed, the Python code is the ONLY one where it is not absolutely obvious that x never gets to 3. Huh, how about that? I mean, the plain reading of the code “for x in range(0, 3):” surely, to the lay person, means that x gets to 3 because, you know, 3 being “in the range” and all. Except, mysteriously, that’s not the case. Apparently 0 is “in range” but somehow 3 is not “in range”. How about that. Amazing. What an odd, unintuitive, dumb and downright ignorant construct. Huh. But, in every other language it’s pretty clear that x stays “less than 3”.
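And for the skeptics, here's the two-second proof:

# range(start, stop) is half-open: start is included, stop is not.
print(list(range(0, 3)))   # prints [0, 1, 2] -- x never reaches 3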
Honestly, if I’m looking at code readability, I have to tip my hat to Pascal. So, more readable? That’s a big, big negative.
Python is FAR more productive, requiring fewer lines of code
"I tell you Earl, you’ll be so much more productive in Python you can replace those 50 Java developers with just a single Python developer by golly! Yeeee haawwww!!" This is the whopper lie and this one kills me for various reasons.
First, is this really the standard we want for deciding which programming language to use, whichever one takes the fewest lines of code? Personally, I feel that this is an incredibly misguided and dangerous concept. I mean, if that’s the standard, why in the world did we ever move away from Perl? If that were the case, shouldn’t we all still be coding Perl? These are the folks who were so proud of obfuscated Perl code that they held contests to see who could write it in the least number of bytes. Bytes. Bytes I tell you.
Go ahead Python programmers, write an alphametic solver in less than 103 bytes like this Perl code:
while($_=$a=pop){/([a-z])/i?map{($t=$a)=~s/$1/$_/g; $a!~/$_/&&push@ARGV,"$t"}0..9:/\b0\d/||eval&&print}
Any takers? No? Wait, what's that? Snake got your tongue? Yeah, that's what I thought. All talk and no code. Just FYI, that code comes by way of Michael Woodhams and will take something like send+more==money and solve it to 9567+1085==10652.
So, that argument holds no water. But even if, even if you could, that’s the criterion for why Python is better? You can write fewer lines of code… Or is it the same amount of code but you imported a library? Which, if that’s the case, see “Python has huge tracts of libraries” above. Earth to Python, you didn’t invent reusable code. Functions have been around for a long time, people. And fewer lines of code is not necessarily a good thing.
OK, another reason in a similar vein. Look at the nine for loops above. Every one of them can be written as a single line of code. So if we write those for loops as single lines of code, did we just double or triple our productivity? Clearly we did not. So what is this nonsense about productivity and lines of code?
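Case in point, the Python loop from the listing above as one perfectly legal line:

for x in range(0, 3): print("value of x: %d" % (x))   # one line, same loop, no extra productivity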
Oh, and just one more tidbit. If you look at the Gamma code above, the Python code is 32 lines, the DAX code is 30 lines…
Sooooo… Yeaahhhh…
Final thoughts on Python
Look, I'm going to clear up the reason behind why Python is so popular. It's not exactly a mystery and it has absolutely nothing to do with any supposed "benefits". It is popular for the sole reason that Google seemed to inexplicably glom on to it in 2005 or thereabouts. That's it. Nobody had ever heard of it or cared about it until then despite the fact that it had been around for 15 years.
Now, I don't know for sure, but I'm guessing the adoption by Google went something along the lines of the following:
Some noob programmer Python fanboy at Google that didn't know anything other than Python got put in charge of something way above their pay grade and this noob programmer Python fanboy was all like “Hey, we're Google and I’m super cool and edgy and doing something way above my pay grade and we're like going to do stuff different and stuff and not be all like controlled by ‘the man’ and stuff because I’m a rebel and edgy and stuff. So we’re going to use this brand new programming language called Python because I’m too dumb to know it’s been around for 15 years and nobody uses it because, you know, we’re Google and we’re like super cool and hip and different and not at all evil because, you know, it says so in our corporate motto…and stuff”.
And then, everyone else on the team was like: “Right on, we’re Google and we’re also super cool and hip and stuff and we’re going to do something different because we’re also rebels and for no other reason than we can. Middle fingers to 'the man'...and stuff. So we’re going to use this new cool programming language called Python because we’re also too dumb to know that it’s been around for 15 years and nobody uses it or cares about it.”
And thus, Google adopted Python. And, since Google adopted Python, other tech folks took notice and wondered what this new, hip, super cool, trendy programming language was that they were missing out on because THEY were also too dumb to know that Python had been around for 15 years and nobody cared about it. And there you have it. Which, by the by, is exactly the same reason why cancer sticks became popular. So...good job noob Google hipster Python fanboy?
On the plus side though, Python is absolutely, god-awfully dog slow and one of the very few languages in existence that breaks outright because you don't indent things. So…it’s got that going for it. So Python fanboys and fangirls, give it a rest. Your language is pretty much trash and certainly no better than all of the other languages that are out there. The supposed benefits you tout really just make it painfully obvious that you're new to this and have apparently never actually coded in another language before.
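If you've never seen the indentation thing bite, here's a quick demo (my addition; the exact error text varies by Python version, so take the comment as approximate):

# Indentation is syntax in Python: a mis-indented block won't even compile.
src = "def f():\nprint('hello')"   # function body not indented
try:
    compile(src, "<demo>", "exec")
except IndentationError as err:
    print(err)   # e.g. "expected an indented block"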
Conclusion
Look, I’m really not just bagging on Python here. I pretty much feel this way about most languages at this point. I mean, there must be nearly a thousand programming languages listed on Wikipedia. Guess what? They all reduce to binary code at some point so...can anyone say redundant? Gee, I wonder how big our "tracts of libraries" would be if we had all been coding in the same programming language for the last few decades? Seriously, how much further along would we be as an industry, or even as a society, if we had leveraged the entire collective knowledge and wisdom of millions of programmers into a single cohesive code base instead of constantly reinventing the wheel?
Think about that for a moment. Popular general purpose programming languages over the years have gone from Pascal to C to Java and now to Python, just to name a few. I'm not against change, far from it. But change for good reasons. Moving from one general purpose programming language to another does not solve any of the fundamental issues with programming. It's like the old man in that Monty Python skit: "I'm not dead yet". Too bad, bonk on the head... But, in particular, recycling a decades-old language from the dust bin of history is perhaps the WORST way to make any real progress.
We information technology wonks need to wise up, because all we are doing is solving the same problem a couple thousand different times. Every so often we move from one general purpose programming language to another, and for what? Where is the value add? I mean, at least when the music industry swaps formats there is some value added: the ability to skip between tracks, not having your tape eaten, and now not having to worry about scratching your CD. But trading one programming language for another is beyond useless. It all becomes 1's and 0's in the end, people. For an industry that is literally built upon the concept of time saving and efficiency, we sure as heck don't practice what we preach.