Thursday, February 29, 2024

ipod click and skip

Some hidden history of the iPod. My boss digs Steve Jobs' attitude toward excellence - and his aptitude for taking resources at hand (in the case of the iPod, a new small Toshiba hard drive) and applying them in novel ways.

Two things I hadn't heard much about:

  1. the signature click wheel drew heavily from a phone, the Bang & Olufsen BeoCom 6000
  2. Part of the secret sauce was a large 32MB "skip buffer" - advertised as "20 minute skip protection" (remember, this was the age of jostled portable CD players making for a poor listening experience). Its true purpose was buffering songs: the device could load a few songs at once rather than keep the little hard drive constantly spinning, roughly tripling the battery life to meet critical performance metrics. (Rough sketch below.)
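
(A sketch of the idea in TypeScript pseudocode - all the names here are mine, nothing from the actual firmware: spin the disk up, fill RAM with the next few tracks, then let it sleep while playback drains the buffer, and repeat.)

  // Hypothetical sketch of read-ahead buffering (all names invented)
  interface Track { bytes: number }
  interface Disk { spinUp(): void; spinDown(): void; read(t: Track): Promise<ArrayBuffer> }
  interface AudioOut { play(data: ArrayBuffer): Promise<void> }

  const BUFFER_BYTES = 32 * 1024 * 1024; // the 32MB RAM buffer

  async function playQueue(tracks: Track[], disk: Disk, audio: AudioOut) {
    let i = 0;
    while (i < tracks.length) {
      disk.spinUp();
      const batch: ArrayBuffer[] = [];
      let used = 0;
      // fill the buffer with as many upcoming tracks as fit
      // (assumes every track fits in the buffer on its own)
      while (i < tracks.length && used + tracks[i].bytes <= BUFFER_BYTES) {
        batch.push(await disk.read(tracks[i]));
        used += tracks[i].bytes;
        i++;
      }
      disk.spinDown(); // the battery win: the disk sleeps while we play from RAM
      for (const data of batch) await audio.play(data);
    }
  }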

Wednesday, February 28, 2024

dutch tulips anyone?

Really excellent overview charting the first dot-com bust through to the current dreams of AI as the savior of tech - going to copy/paste it here for posterity...

(I would say too, it's really bad that the only micro-/nano-transaction model we came up with for the Internet was ads, with all the privacy breaking and other nonsense that comes with them.)

If anyone wants to know why every tech company in the world right now is clamoring for AI like drowned rats scrabbling to board a ship, I decided to make a post to explain what's happening.


(Disclaimer to start: I'm a software engineer who's been employed full time since 2018. I am not a historian nor an overconfident Youtube essayist, so this post is my working knowledge of what I see around me and the logical bridges between pieces.)


Okay anyway. The explanation starts further back than what's going on now. I'm gonna start with the year 2000. The Dot Com Bubble just spectacularly burst. The model of "we get the users first, we learn how to profit off them later" went out in a no-money-having bang (remember this, it will be relevant later). A lot of money was lost. A lot of people ended up out of a job. A lot of startup companies went under. Investors left with a sour taste in their mouth and, in general, investment in the internet stayed pretty cooled for that decade. This was, in my opinion, very good for the internet as it was an era not suffocating under the grip of mega-corporation oligarchs and was, instead, filled with Club Penguin and I Can Haz Cheezburger websites. 


Then around the 2010-2012 years, a few things happened. Interest rates got low, and then lower. Facebook got huge. The iPhone took off. And suddenly there was a huge new potential market of internet users and phone-havers, and the cheap money was available to start backing new tech startup companies trying to hop on this opportunity. Companies like Uber, Netflix, and Amazon either started in this time, or hit their ramp-up in these years by shifting focus to the internet and apps.


Now, every start-up tech company dreaming of being the next big thing has one thing in common: they need to start off by getting themselves massively in debt. Because before you can turn a profit you need to first spend money on employees and spend money on equipment and spend money on data centers and spend money on advertising and spend money on scale and and and


But also, everyone wants to be on the ship for The Next Big Thing that takes off to the moon.


So there is a mutual interest between new tech companies, and venture capitalists who are willing to invest $$$ into said new tech companies. Because if the venture capitalists can identify a prize pig and get in early, that money could come back to them 100-fold or 1,000-fold. In fact it hardly matters if they invest in 10 or 20 total bust projects along the way to find that unicorn. 


But also, becoming profitable takes time. And that might mean being in debt for a long long time before that rocket ship takes off to make everyone onboard a gazillionaire.


But luckily for tech startup bros and venture capitalists, being in debt in the 2010s was cheap, and it only got cheaper between 2010 and 2020. If people could secure loans for ~3% or 4% annual interest, well then a $100,000 loan only really costs $3,000 of interest a year to keep afloat. And if inflation is higher than that, or at least similar, you're still beating the system.


So from 2010 through early 2022, times were good for tech companies. Startups could take off with massive growth, showing massive potential for something, and venture capitalists would throw infinite money at them in the hopes of pegging just one winner who will take off. And supporting the struggling investments or the long-haulers remained pretty cheap to keep funding. 


You hear constantly about "Such and such app has 10-bazillion users gained over the last 10 years and has never once been profitable", yet the thing keeps chugging along because the investors backing it aren't stressed about the immediate future, and are still banking on that "eventually" when it learns how to really monetize its users and turn that profit.


The pandemic in 2020 took a magnifying-glass-in-the-sun effect to this, as EVERYTHING was forcibly turned online which pumped a ton of money and workers into tech investment. Simultaneously, money got really REALLY cheap, bottoming out with historic lows for interest rates. 


Then the tide changed with the massive inflation that struck late 2021. Because this all-gas no-brakes state of things was also contributing to off-the-rails inflation (along with your standard-fare greedflation and price gouging, given the extremely convenient excuses of pandemic hardships and supply chain issues). The Federal Reserve whipped out interest rate hikes to try to curb this huge inflation, which is like a fire extinguisher dousing and suffocating your really-cool, actively-on-fire party where everyone else is burning but you're in the pool. And then they did this more, and then more. And the financial climate followed suit. And suddenly money was not cheap anymore, and new loans became expensive, because loans that used to compound at 2% a year are now compounding at 7 or 8% which, in the language of compounding, is a HUGE difference. A $100,000 loan at a 2% interest rate, if not repaid a single cent in 10 years, accrues to $121,899. A $100,000 loan at an 8% interest rate, if not repaid a single cent in 10 years, more than doubles to $215,892.
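
(An aside from me, not the original post: that arithmetic checks out - a quick sketch if you want to play with the numbers yourself.)

  // verifying the loan math: simple carrying cost, then 10 years of compounding
  function carryingCost(principal: number, rate: number): number {
    return principal * rate;
  }
  function compound(principal: number, rate: number, years: number): number {
    return principal * Math.pow(1 + rate, years);
  }

  console.log(carryingCost(100_000, 0.03));            // 3000 a year to stay afloat
  console.log(compound(100_000, 0.02, 10).toFixed(0)); // 121899
  console.log(compound(100_000, 0.08, 10).toFixed(0)); // 215892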


Now it is scary and risky to throw money at "could eventually be profitable" tech companies. Now investors are watching companies burn through their current funding and, when the companies come back asking for more, investors are tightening their coin purses instead. The bill is coming due. The free money is drying up and companies are under compounding pressure to produce a profit for their waiting investors who are now done waiting. 


You get enshittification. You get quality going down and price going up. You get "now that you're a captive audience here, we're forcing ads or we're forcing subscriptions on you." Don't get me wrong, the plan was ALWAYS to monetize the users. It's just that it's come earlier than expected, with way more feet-to-the-fire than these companies were expecting. ESPECIALLY with Wall Street as the other factor in funding (public) companies, where Wall Street exhibits roughly the same temperament as a baby screaming crying upset that it's soiled its own diaper (maybe that's too mean a comparison to babies), and now companies are being put through the wringer for anything LESS than infinite growth that Wall Street demands of them. 


Internal to the tech industry, you get MASSIVE widespread layoffs. You get an industry where it used to be easy to land multiple job offers shriveling up, leaving recent graduates in a desperately awful situation where no company is hiring and the market is flooded with laid-off workers trying to get back on their feet.


Because those coin-purse-clutching investors DO love virtue-signaling efforts from companies that say "See! We're not being frivolous with your money! We only spend on the essentials." And this is true even for MASSIVE, PROFITABLE companies, because those companies' value is based on the Rich Person Feeling Graph (their stock) rather than the literal profit money. A company making a genuine gazillion dollars a year still tears through layoffs and freezes hiring and removes the free batteries from the printer room (totally not speaking from experience, surely) because the investors LOVE when you cut costs and take away employee perks. The "beer on tap, ping pong table in the common area" era of tech is drying up. And we're still unionless. 


Never mind that last part.


And then in early 2023, AI (more specifically, ChatGPT, which is OpenAI's Large Language Model creation) tears its way into the tech scene with a meteor's amount of momentum. Here's Microsoft's prize pig, which it invested heavily in and is gallivanting around the pig-show with, to the desperate jealousy and rapture of every other tech company and investor wishing it had that pig. And for the first time since the interest rate hikes, investors have dollar signs in their eyes, both venture capital and Wall Street alike. They're willing to restart the hose of money (even with the new risk) because this feels big enough for them to take the risk.


Now all these companies, who were in varying stages of sweating as their bill came due, or wringing their hands as their stock prices tanked, see a single glorious gold-plated rocket up out of here, the likes of which haven't been seen since the free money days. It's their ticket to buy time, and buy investors, and say "see THIS is what will wring money forth, finally, we promise, just let us show you."


To be clear, AI is NOT profitable yet. It's a money-sink. Perhaps a money-black-hole. But everyone in the space is so wowed by it that there is a wide-spread and powerful conviction that it will become profitable and earn its keep. (Let's be real, half of that profit "potential" is the promise of automating away jobs of pesky employees who peskily cost money.) It's a tech-space industrial revolution that will automate away skilled jobs, and getting in on the ground floor is the absolute best thing you can do to get your pie slice's worth.


It's the thing that will win investors back. It's the thing that will get the investment money coming in again (or get it second-hand, if the company can be the PROVIDER of something needed for AI, which other venture-backed companies will pay handsomely for). It's the thing companies are terrified of missing out on, lest it leave them utterly irrelevant in a future where not having AI integration is like not having a mobile phone app for your company or not having a website.


So I guess, to reiterate my earlier point:


Drowned rats. Swimming to the one ship in sight. 



Wednesday, February 21, 2024

AI-yi-yi


Subprime Intelligence - thoughtful piece on how strong the boom in AI really is - especially as we may be nearing the top of an S-shaped curve of capabilities.

I think it's a great point that for much of the generative stuff, the "killer app" isn't there (though there are a lot of small use cases, and LLMs are certainly a boon for helping software developers navigate the thick woods of toolkits and languages we've created).

It will also be interesting to see how the public's "AI radar" sharpens, and how much people start to notice and possibly get annoyed at the garish flatness of it all.

But especially the economics... hearing how Microsoft's "investment" in OpenAI was more like a donation of computer time, and other investments seem more about tech giants trying to ensure they're the ones selling the raw cloud horsepower... sometimes you wonder where we are all going with this.

catching up on newsletters

Catching up on frontend newsletters...

  • I'm always impressed by The Coding Train - I love p5/processing and these videos seem like a great way to learn programming.
  • The rise and possible fall of the square checkbox was a fun visual history.
  • The state of the states...
  • Back in the caveman days of Perl CGI, it sort of made sense that I used the same language for quick-and-dirty shell scripts and web programming... but now that my go-to is PHP for the latter (at least on the server side)? Having a command line script inside <?php ?> always feels weird. Maybe I should switch to Bun for that? (See the sketch after this list.)
  • Web Components in Earnest seems pretty deep
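
For what it's worth, a rough sketch of what that Bun switch might look like - the file name and task are hypothetical, but it's all standard Bun/Node API:

  #!/usr/bin/env bun
  // cleanup.ts - a quick-and-dirty CLI script, Bun-style: no <?php ?> wrapper,
  // and TypeScript plus top-level await work out of the box
  import { readdir } from "node:fs/promises";

  const dir = process.argv[2] ?? ".";
  for (const f of await readdir(dir)) {
    if (f.endsWith(".tmp")) console.log(`would remove ${dir}/${f}`);
  }

chmod +x cleanup.ts and it runs like any other shell script.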

beginners who like TypeScript

from 3 Lessons from Building a Chrome Extension with TypeScript

It is striking to me that the "#1 lesson learned" is 

TypeScript feels like Python but JavaScript. It might be a reductive way of describing it, but it feels a lot friendlier to read for a non-techie, self-taught pleb like myself.

The Type interface (even though not as heavily utilized in this project) is amazing. Not struggling with broken code just cause I've inputted a different type that is not supposed to be there.

The streamlined approach to getting what you want, rather than the run around that JS makes you do, helps provide a greater understanding of what is happening from function to function.

I find this such an alien take. For me TS is the opposite of "streamlined". I appreciate it for expressing the shapes of arbitrary ad hoc JSON formats (though there are other ways of getting that), but I'm more likely to resent the syntax mess it imposes. Even by itself, JS (and especially with React) does a lot of juggling with a pretty small number of characters ( { } : = ? ), so what any given character means in a snippet of code is heavily dependent on context, and harder to read at a glance. And "this only works with a fancy build environment" further erodes any sense of being streamlined. While I know JS in the browser is gonna be some kind of sandbox in a sandbox, having to trust a mapping file to get me from the code I'm inspecting back to the code I wrote never feels great.
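
(To show what I mean about those few characters wearing many hats, a contrived but legal snippet:)

  // the same handful of characters, doing different jobs depending on context:
  type Opts = { label?: string };           // { } is a type literal, ? marks an optional property
  const render = ({ label }: Opts = {}) =>  // { } is destructuring, : an annotation, = a default
    label ? `<b>${label}</b>` : "";         // now ? and : are a ternary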

I dunno. Sometimes I worry BASIC, Perl and JS broke my brain, and that my feelings that duck typing greases the wheels more than it creates problems - and that the speed of a quick interpretation-vs-compiling loop is worth its weight in gold - are actually signs of derangement. Maybe Dijkstra was right when he said

"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."

(I always assumed he was talking more about line numbers....)

Like I don't MIND strong typing of primitive types, but I'd prefer the syntax were prettier than

let foo: string | undefined;

(like from my Java days, I wish the descriptors came before the variable name)
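
(i.e. the contrast I mean is something like:)

  // what I'm used to from Java: the type leads...
  //   String foo;
  // ...vs TypeScript's name-first, type-after-the-colon order:
  let foo: string | undefined;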

Also the author didn't have to wrestle with some of TypeScript's inheritance ...

Or maybe brains are just different. 

FOLLOWUP:
On a private Slack we got into talking about the pros/cons of types. 

I wrote

Sometimes it feels like "type" as the term for both primitive types and key/value conglomerations is weird.

like I know if you were thinking in terms of classes it should all be the same thing. But JS is very much not really that class centric..


Ashish wrote 

For me everything is a type. Records/structs have type. Map/Dictionary is a type. Functions are type (takes parameters of certain type and  returns a result of certain type) etc. i recommend Elm or purescript to check out if you like front end stuff.
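
(Translating his point into TypeScript terms, I think it looks something like this:)

  // records, dictionaries, and functions all described by the one type system
  type User = { name: string; age: number };    // record/struct
  type Scores = Map<string, number>;            // map/dictionary
  type Greeter = (u: User) => string;           // function type
  const greet: Greeter = (u) => `hi ${u.name}`;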


My response was:

https://elm-lang.org/examples/buttons - this does not look like it would play well with my head.  Seems like a bit too many layers of abstraction from what I think of as UI these days :smile:

It makes me think of when we used https://immutable-js.com/ at CarGurus - I think the benefits of predictability you get with immutability were way outweighed by the lack of syntactic sugar and sometimes the performance implications. (Not helped by the wayyy academic documentation for it.)

I mean that's the whole thing with functional programming and UI - almost by definition, everything interesting a UI does is a side effect (most often showing something to the user) - the very opposite of what functional programming wants to be

(Getting more deeply into React, it feels more and more rickety - like it just puts MVC into one big mess of render for the view and controller to mess with the model but not TOO much)
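
(The syntactic sugar gap I was complaining about, roughly - this is from memory, so the details may be off:)

  import { Map as IMap } from "immutable";

  // plain JS: the sugar you're used to
  const plain = { user: { name: "Kirk" } };
  const plainName = plain.user.name;

  // Immutable.js: every access and update goes through the API...
  const im = IMap({ user: IMap({ name: "Kirk" }) });
  const imName = im.getIn(["user", "name"]);
  // ...in exchange, an "update" returns a new structure instead of clobbering shared state
  const updated = im.setIn(["user", "name"], "K2");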

I think of this (1998!) quote:

Menu items are the modern programmer's way -- even that of the Java programmer, who is too pure of heart to use pointers -- of putting an obscene number of unpredictable GOTO statements everywhere in his code.

jhayward@imsa.edu in 1998, via Usenet's "rec.humor.funny"


Wednesday, February 14, 2024

sequence diagrams

This is 101 stuff, but: Swagger is pretty great at showing you the exposed endpoints and letting you play with them, but it doesn't always provide a good sense of information flow.

A sequence diagram can help, even if there are just two agents:


sequencediagram.org seems like a fantastic option! It uses a simple text-based format (almost akin to markdown) but also lets you do some of the editing right on the chart. (Reminds me of my habit of making a first pass of a UI that's just a big textarea using an ad hoc data format for each line - it's rather "expert mode" stuff but it postpones having to make a proper drag-and-drop or similar tool for reordering blocks, plus you get a convenient saving/sharing mechanism.)
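
For a taste of the text format (typed from memory - check the site for the real syntax reference):

  title Login flow
  participant Browser
  participant API
  Browser->API: POST /login
  API-->Browser: 200 OK + token
  Browser->API: GET /profile (with token)
  API-->Browser: profile JSON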

The one slightly hidden bit of that web app is where to get a URL to share (which for me is one of the main attractions for collaboration) - it's the 4th action icon down that implies "export", then it's "URL to Share"/Create. 





Monday, February 12, 2024

sshfs to mount remote file systems on macs!

Pretty cool way of mounting a remote Unix filesystem: How To Use SSHFS on macOS (maybe it beats previous ideas I had with rsync?)
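
The basic shape of it (the host and paths here are made up, and on macOS you'll need macFUSE installed first):

  # mount: the remote directory shows up as a local folder
  mkdir -p ~/mnt/devbox
  sshfs user@devbox.example.com:/home/user/projects ~/mnt/devbox

  # unmount when done (plain umount on macOS)
  umount ~/mnt/devbox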

Friday, February 9, 2024

we are all strange loops

One of my favorite books is Douglas Hofstadter's "I Am a Strange Loop", which picks up some threads from his more famous "Gödel, Escher, Bach" but also reflects him coping with the death of his soulmate wife. Hofstadter is trying to see if his ability to have conversations with Carol (based on his earlier history of a proven, high-fidelity ability to predict just what she would say) could in a way BE her living on in his head and heart - in a philosophical (and not merely poetic) way - or if that was just a consoling bit of wishful thinking.

In talking about mind and consciousness, he constructs a playful physical metaphor of the "careenium" - a bouncing-magnetized-pinballs thought experiment of how we come to model the world in our own craniums - a model rich enough to include ourselves as a model doing the modeling, and so on and so on.

Googling to try to remember the term "Careenium", I found this page explaining the concept and comparing it to GPT. One challenge you run into if you collaborate with GPT is that it's not doing a great job of modeling the problem at hand in its virtual head: its model of the world is fairly static, and a conversation with it (as impressive as it is! especially if you've played with the previous fruits of AI over the past decades) is just a probabilistic word journey through that static space.

In some ways it's right there in the name: GPT means "Generative Pre-trained Transformer", and the problem is the "Pre" - and earlier "Transformers" were notably worse at keeping track of what was just said in the dialog.

I guess the implication is if GPT had greatly increased abilities to update its own model on the fly, if that process was more organically bootstrapped and ongoing, it might be a better candidate for "true" Artificial General Intelligence and even consciousness...


Thursday, February 8, 2024

null: the billion dollar mistake

I tried to look smart today when one of my teammates mentioned he was finding new problems with a data stream because, previously, some other process was treating null as zero...

It had me hunt down this quote from Tony Hoare on the invention of null:

I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.

Heh.
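
(Boiling that bug down to a sketch - hypothetical data, but this is the shape of the problem:)

  // upstream "helpfully" coerces null to zero...
  const readings: (number | null)[] = [5, null, 7, null];
  const coerced = readings.map((r) => r ?? 0);
  const avgCoerced = coerced.reduce((a, b) => a + b, 0) / coerced.length;
  // 3 - looks plausible, silently wrong

  // ...vs keeping null distinct, so missing data stays visible
  const present = readings.filter((r): r is number => r !== null);
  const avgPresent = present.reduce((a, b) => a + b, 0) / present.length;
  // 6 - and we know 2 readings were missing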

Mac Phantom Messages Badge - how to remove

NEWER UPDATE: 

The trick is to activate Siri on the Mac if she isn't already active, and then say "Siri, read me my unread messages". It dug out 6 messages from years ago that were for whatever reason not showing up in the Messages app.

I found a few partial solutions to my Mac (Ventura 13.4) always showing 6 unread messages on the Dock (none of my other devices showed any), but either they weren't doable as listed or they didn't work.

I believe the steps were:

  1. Quit Messages (after making sure that all messages do indeed seem read - you should be able to go to View | Unread Messages)
  2. Go to System Settings | Notifications
  3. Scroll down to Messages and click to open the panel
  4. Turn off "Badge application icon"
  5. In a terminal window, run "killall Dock" (I suspect you might be able to use Force Quit Finder instead, but I haven't tested that way). The Dock should go away then bounce back.
  6. Start up Messages
  7. Turn "Badge application icon" back on

Good luck!

UPDATE: I lied. At the next message I got, the number popped back up to 7. The ghost messages must be living in iCloud land...



Friday, February 2, 2024

the pop, the rise, the fall of tech jobs



via Statista - I've really been wondering about this chart, especially hearing how the USA added a surprising chunk of jobs in January.

Tech jobs took a massive hit at the start of quarantine but were back where they were in about a year - and kept on going.

This chart makes it look like we're still about where we might have been otherwise, but this is as of October 2023... and I haven't seen too many signs that the roller coaster back down has leveled out or resumed its ascent.