Wednesday, July 17, 2024

everybody's free (to write websites)

By Sara Joy

Enbies and gentlefolk of the class of ‘24:

Write websites.

If I could offer you only one tip for the future, coding would be it. The long term benefits of coding websites remain unproven by scientists; however, the rest of my advice has a basis in the joy of the indie web community's experiences. I will dispense this advice now:

Enjoy the power and beauty of PHP; or never mind. You will not understand the power and beauty of PHP until your stack is completely jammed. But trust me, in 20 years you’ll look back at your old sites and recall in a way you can’t grasp now, how much possibility lay before you and how simple and fast they were. JS is not as blazingly fast as you imagine.

Don’t worry about the scaling; or worry, but know that premature scalability is as useful as chewing bubble gum if your project starts cosy and small. The real troubles on the web are apt to be things that never crossed your worried mind; if your project grows, scale it up on some idle Tuesday.

Code one thing every day that amuses you.

Style.

Don’t be reckless with other people’s data; don’t put up with people who are reckless with yours.

POSSE.

Don’t waste time on shiny new frameworks; sometimes they’re helpful, sometimes they’re a trap. The web platform doesn’t need gigs of node_modules.

Remember the guestbook entries you receive; forget the spam. If you succeed in doing this well, tell me how.

Keep your old site designs. Throw away your old nested <div>s.

Flex.

Don’t feel guilty if you don’t know what you want to do with your site. The most interesting websites don’t even have an introduction, never mind any blog posts. Some of the most interesting web sites I enjoy just are.

Add plenty of semantic HTML.

Be kind to your eyes, your visitors will appreciate a nice theme.

Maybe you’ll blog, maybe you won’t.
Maybe you’ll have users, maybe you won’t.
Maybe you’ll give up that cool domain.
Maybe you’ll sell that little project and hate what the buyers do with it.

Whatever you do, don’t congratulate yourself too much, or berate yourself either. Your code is half spaghetti; so is everybody else’s.

Enjoy your <body>. Style it every way you can. Don’t be afraid of CSS, or what other people think of it. It’s the greatest design tool you’ll ever learn.

Animate, even if you only try it out in your local IDE or CodePen.

 Read the documentation, even if you don’t follow it.

Do not read React dev rel articles; they will only make you feel confused.

Get to know the web platform; HTML, CSS and JS are there for good.

Be nice to your community; they are your hyperlinks that keep the web interconnected and the people who will give the web a future.

Understand that frameworks come and go, but for a precious few you should donate to the maintainers.

Work hard to bridge the gaps in accessibility and responsiveness, because the older you get, the more you need the accommodations you didn’t need when you were young.

Host on Netlify once, but leave before it makes you static.

Host on Überspace once, but leave before it makes you dynamic.

Contribute.

Accept certain inalienable truths: connection speeds will rise, techbros will grift, you too will get old— and when you do, you’ll fantasize that when you were young websites were light-weight, tech founders were noble and fonts used to be bigger.

Respect the W3C.

Ask for help and people will support you.

Maybe you have a patreon, maybe you have venture capital funding; but you never know when either one might run out.

Don’t mess too much with your tabbing order, or by the time you’ve got arthritis, using a keyboard will be useless.
Be careful whose advice you buy, but be patient with those who supply it.

The old web is a form of nostalgia. Rebuilding it needs to be more than fishing the past from the disposal, painting over the inaccessible parts and recycling it for more than it’s worth.

But trust me on the websites.


(It's funny comparing this to another article I ran into today: Why I'm Over GraphQL. I've never used it but it always seemed like a weirdly over- (or maybe under-) engineered idea...)

Friday, July 12, 2024

free airline wifi! sort of

https://robertheaton.com/pyskywifi/ - using an account info screen as a low-bandwidth but free TCP/IP-ish gateway.

Somehow this reminds me of how "Pickle Rick" in Rick and Morty bootstraps himself with sewer-dweller parts and miscellaneous junk into a Barbie-sized Terminator warrior...
 

Wednesday, July 10, 2024

state of js

https://2023.stateofjs.com/ -
  • the gender disparity in the respondent demographics is a little disheartening
  • the footprint of JS/ECMAScript a dev might choose to use (and should be able to recognize) keeps growing
  • I do love this kind of graph, with its double axis of "I have used" and "I like it"

 


note to self - figma replacement?

 Note to self, at some point I should check out Penpot - looks like an interesting replacement for Figma.

(To be honest, I think it's a little bizarre that bare Figma starts you with like, just the rawest of shapes, and not much that looks like an actual UI until you start importing a lot of library things.)

Monday, July 8, 2024

urchin

 I kind of like how the semi-ubiquitous UTM codes for putting analytics trackers in URLs come from the delightfully named, now defunct, yet perpetually memorialized company "Urchin" - here is a page on their history.

Also I like how they "cleaned up" their charming logo to make it look more professional...



Tuesday, July 2, 2024

KISS

I appreciate Chris Ferdinandi's "Go Make Things". He's a kindred soul, shaking his head as I do about how needlessly complex web development has gotten. (Maybe I should also take his lead on web components...)

But I really appreciated Evergreen tech is an asset (and dependencies are a liability).
For so many apps and sites, there just isn't that much the front end needs to do that core browser tech doesn't handily provide...

Maybe I should lean on the framing of "evergreen". It almost seems like a contradiction for "tech", but the fact is there are conservative, low-turnover parts of tech and edgy, high-turnover parts prone to flavor-of-the-month churn, and the latter doesn't have as many benefits as much of the industry seems to assume. Someone once said the Internet really has just one big trick: getting information onto and off of someone else's server - and PHP and browser-native tech do that very well, even in attractive ways if you know how to work them.

Monday, July 1, 2024

1, 2, 3, 4 I declare cyberwar

 I need to reconsider how I'm getting my news headlines, because I hadn't heard about the major auto finance industry cyberattack making many dealers fall back to old paper-based systems.

Hello from the Middleman Economy quotes Cory Doctorow:

"This is the American story of the past four decades: accumulate tech debt, merge to monopoly, exponentially compound your tech debt by combining barely functional IT systems. Every corporate behemoth is locked in a race between the eventual discovery of its irreparable structural defects and its ability to become so enmeshed in our lives that we have to assume the costs of fixing those defects."
Actually it's probably better just to read that Doctorow piece.

 

Thursday, June 27, 2024

humans and robots, working together

 Interview with the CEO/President of CodaMetrix (my new company)

As a kid, thinking of the computer-y future (and literally wondering when Bill Gates was going to make a program that wrote other programs) I kept up what was mostly a romantic, sci-fi driven hope that the best results were going to be from humans cooperating with computers, rather than either acting alone.

In a lot of ways that's where we are at now. Anecdotally, a lot of programmers are finding ChatGPT and its ilk enormously helpful as a "second pair of eyes" and as a well-read but distinctly junior-level pair programmer. Similarly, I'm happy to have a role here on UX/UI at CodaMetrix, helping pull humans back into the loop, both for specific medical coding issues and for providing informed guidance and oversight of the overall process.


 

Wednesday, June 19, 2024

AI-aiiiiiii

 I Will Fucking Piledrive You If You Mention AI Again - really very thoughtful piece:

The crux of my raging hatred is not that I *hate* LLMs or the generative AI craze. I had my fun with Copilot before I decided that it was making me stupider - it's impressive, but not actually *suitable* for anything more than churning out boilerplate. Nothing wrong with that, but it did not end up being the crazy productivity booster that I thought it would be, because *programming is designing* and these tools aren't good enough (yet) to assist me with this seriously
The thing is, in an age where frontend programming libraries and toolkits have just EXPLODED like mushrooms and we have more "flavors of the month" than a Baskin-Robbins in a timelapse, "churning out [customized!] boilerplate" is an enormously helpful thing.

Tuesday, June 4, 2024

when usability misthinks and security nightmares go hand in hand

 

Why is Microsoft so hell bent on making this security nightmare?

Like, even putting the security dangers aside, it's such a weird usability misthink (IMO; I'm sure some segment of users would appreciate it.) Like, you sort of need to embrace the ephemeral nature of day-to-day digital, and take steps to recognize what you want to preserve, and come up with a mechanism and structure that works for you to preserve it. Leaning on the computer playing "Little Big Brother" as a convenience feature is no way to live.

I think of parallel examples from a simpler age: bookmark managers. Every browser would like to be your main bookmark repository, since that increases the browser's value (and "stickiness") to you. But early on, I took the HTML page that Netscape Navigator was using internally to store your bookmarks (yes, I'm old) and put that on my rented webspace (yes, I'm an old geek). Then I could use any browser at work or home and do my own conscious curation of what bookmarks were worth keeping.

(As an old geek aside: I am appalled at the universality of linkrot. A Good URL can and should live forever, we old school geeks thought, and I try to live up to that with my personal sites - but this seems to be an increasingly rare approach, and maybe one in fifteen links I have on my old 90s bookmarks page still works.)

Similarly, a lot of product lines try to lure users with being able to pick up on one device where another one leaves off - like handing off from a phone's browser to the laptop or vice versa. I'm not a purist against cross-device sharing - I rely on Apple's shared clipboard fairly often - but making a "seamless" handoff seems like a fool's errand to me, and as likely to startle the user as to be helpful - they are different devices with different use modes, and when the need to transfer does occur.... I mean that's what URLs have always been for.

This isn't a black and white issue. There are some kinds of "ease of use" features I depend on - like I don't usually need my browser to record my bookmarks, but I DO lean on autocomplete for website URLs pretty heavily, and if I switch to a completely new machine it's a pain in the butt for a few days. But recording all my activity via screengrabs (and storing lots of it as plain text)? What a disastrous mixup of "can" and "should" - one of the most idiotic paths in the current AI arms race.

css gives html new options for whitespace

Back in the day, one of the hardest things to get used to with HTML was how it treated white space; any line break in the source code wouldn't show up in the final page, and in fact its only concept of white space was "none", "a space", or "a line break caused by a <br> or <p> tag".

Different content systems aspiring to some flavor of WYSIWYG had to decide how to treat carriage returns - the simplest, like what my homebrew CMS for my blog does, is replace "\n" with "<br>\n". More typographically elegant systems would use <p> tags - BUT - there was always some xhtml-days guilt about having to wrap paragraphs in "<p></p>", and then ambiguity about when you wanted a single line break vs a full-on paragraph break.

(Also there was always the <pre></pre> tag for preserving ALL line breaks and other white space- but this was like the opposite of responsive design)

But these days CSS offers another good option:

white-space: pre-wrap;

This seems to do a great job of preserving carriage returns as they are in the text, as well as preserving runs of spaces (where you used to have to convert them to &nbsp; to get the same effect). You can use the plain "pre" value to get back to full <pre>-tag-like behavior without having to use non-semantic tags.
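For instance, here's a minimal sketch of letting CSS handle the line breaks instead of doing the "\n" to "<br>" replacement (the element IDs are made up):

// Instead of swapping "\n" for "<br>", set the text and let CSS preserve the breaks.
// (element IDs here are hypothetical)
const source = document.getElementById("comment-input");   // e.g. a <textarea>
const target = document.getElementById("comment-display"); // a plain <div>

target.textContent = source.value;       // plain text, no HTML injection worries
target.style.whiteSpace = "pre-wrap";    // or put .comment-display { white-space: pre-wrap; } in a stylesheet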

I probably can't retrofit my blog with this... too much of my old content was too free with the whitespace in general, but it's good to know about for future projects.

Wednesday, May 29, 2024

o'reilly on building with LLMs

Good article on building with LLMs. Ironically, I think ChatGPT or similar is going to be a help for some people unfamiliar with the terms of art used in the article ("n-shot prompting", "Retrieval-Augmented Generation (RAG)", etc.).

Heh, I remember when O'Reilly was the first line of defense for coding (I especially loved their "Cookbook" and "Phrasebook" formats.) It shifted to sites like experts-exchange, and then Google + Stackoverflow, and now ChatGPT.


Monday, May 27, 2024

backlog review...

 AI for Web Devs - In this blog post, we start bootstrapping a web development project using Qwik and get things ready to incorporate AI tooling from OpenAI.

Nice intro to WebComponents - I do love the advent calendar format...

https://tiiny.host/web-hosting-free-sites/ or https://glitch.com/ for static hosting of HTML?

WildAgile: just enough process

 It's probably meant half tongue-in-cheek but WildAgile is a minimalist approach to scrum, one that attempts to somewhat codify (or possibly "legitimize") the process many companies kind of land on - a pragmatic (and sprintless) approach.

Saturday, May 25, 2024

on being dense

Smart article on UI Density - I'm a sucker for anything that cites Edward Tufte. 

Its summary quote is:

UI density is the value a user gets from the interface divided by the time and space the interface occupies.

I think that's good at advancing the art, but it might be falling into the trap of only considering things that can be readily quantified. A UI will live or die based on the mental headspace it takes to understand it. A compacted interface can be off-putting because it requires too much external knowledge to grok it, and so just appears overwhelming.

lifehacks to stay afloat

 I've been floundering with email for a few weeks now.


This morning I came up with an idea to try and gate things a little better: no tumblr (my favorite way of keeping up with the memes) til I'm at just about Inbox Zero. We'll see how it goes.


But one of the things that trips me up is these newsletters - Frontend Focus, JavaScript Weekly, ui.dev's Bytes. On the one hand I'm grateful for them; 5 or 6 years ago I was really feeling out of the loop about where frontend tech was heading - React sort of crept up on me - and now I feel much better informed.
But I'm still disheartened by, and a bit suspicious of, the extreme-complexity flavor-of-the-month stuff. I think this video puts it kind of well: 

Tuesday, May 21, 2024

just trust us (i.e. our Bot)

 
Google is pivoting to REALLY leaning into preferring AI summaries of information rather than traditional web links and snippets. This article is a workaround that (at least for now) makes it easier to stick to web results.

But this trend is so infuriating! Google made its mark with a brilliant "BackRub" algorithm: banking on the idea that a source is more useful and trustworthy based on how many OTHER sites refer to it. That's not Truth, but it's a good first-order approximation.

This throws that out the window. They are putting all their eggs in the basket of "people just want a simple answer" (as Nolan Bushnell puts it, "A simple answer that is clear and precise will always have more power in the world than a complex one that is true") and so replacing their history of "trust us, we'll show you who you can trust!" with a simplistic "just trust us (i.e., our Bot)".

And sure, knowing whom to trust online has always been as much an art as a science - people have to develop their own nose (starting with their own preconceptions) using the content and (for better or worse) the design and presentation of a site. (And while not infallible, I think it shows the wisdom of Wikipedia's approach - insist on citations, and let knowledgeable parties slug it out. Of course conservatives suspect it has a slant - but progressives tend to think of Colbert's "It is a well known fact that reality has liberal bias") But the "blurry JPEG" that is AI causes information to lose its flavor, all piped through the same Siri- or Alexa- or Google-Assistant- friendly stream of words.

And when the AI is wrong - boy there are some anecdotes out there. The one about what to do about a Rattlesnake bite is a killer. Possibly literally. It's almost enough to make one hope for some huge punitive lawsuits.

There is a weird "Idiot Ouroboros" aspect to Google's pivot away from connecting people to other knowledge sources - the AI has its knowledge base from what was gleaned from the web. And now the incentive for building up a reputable parcel on the wider information landscape fades away, and eventually the whole web starts to look like those dark corners of social media where spambots try to pitch their wares to fake user account bots, endlessly.

Saturday, May 18, 2024

draw a line anywhere on a page!

Two quick points:
They released GPT-4o. The o stands for "omni", though based on a few random tech problems I threw at it I'm thinking it stands for "oh, not as good as 4" - I think they tuned it to be fast and to take in more types of inputs in a fluid way, but for my main use (programmer's lil' helper) the web version provides lesser results.

Anyway, 4 had better suggestions for my problem where I wanted to draw arbitrary lines on a page (in my case I'm revamping my "poll editor", and I realized it would be great if the element used to set up an individual poll entry could point to the appropriate place on the preview).

Enter LeaderLine! To rip off that page:

I love it - if your page is dynamically reformatting you might have to call "line.position()" to reset things, and there were a few cases (like when the page was distorted by a weirdly resized textarea) where it wasn't 100% perfect, but I think it nails most use cases and meets my criteria for using a library vs writing bespoke code (a minimal usage sketch follows the list):
1. it pulls its weight by solving a relatively difficult problem
2. it has a reasonably simple API
3. it works with vanilla.js without a build system
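
Here's roughly how little it takes - a sketch assuming two hypothetical element IDs, and assuming the basic constructor and position() call work as the LeaderLine docs describe:

// After loading leader-line.min.js via a <script> tag:
// draw a line from the poll-entry editor to its spot in the preview (element IDs here are hypothetical).
const line = new LeaderLine(
    document.getElementById("poll-entry-editor"),
    document.getElementById("poll-preview-target")
);

// If the layout shifts (window resize, content reflow), nudge the line back into place.
window.addEventListener("resize", () => line.position());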


Friday, May 17, 2024

BASIC and this one weird trick with line numbers

Happy Birthday BASIC - it just had its 60th birthday.

Sometimes I say Perl taught me 5 big things after taking up C in college: maps, first-class strings (vs C character arrays), duck typing, regular expressions, and not having to micromanage memory. But really, BASIC had already exposed me to 3 of those things!

Famously Dijkstra said

It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

Pretty rough talk! Though he may have been talking about a particularly primitive flavor of it.

Line numbers are probably the thing most reviled - certainly they don't encourage modular thinking the way named subroutines do, but as my friend Jeremy "SpindleyQ" Penner pointed out, they are at least a good way to get new (and young!) programmers to understand the step-by-step thinking that is critical to writing programs. (I remember seeing a listing of Amiga BASIC and being blown away that it had no line numbers! Also it reminds me that 8-bit computers had weird flavors of "full screen editing" - line numbers were a way of managing the program listing structure in an age before text editors were always available.)

One trick you heard while learning BASIC in the 80s: number by 10s, so that you can easily insert code before or after a specific line. Weirdly this came in handy when I was creating my 50th Birthday comic book - it started as a bunch of standalone thoughts, which I turned into comic panels (and later grouped into 4 sections). I had a rough idea of the ordering, and so I started each file name with a 3-digit multiple of ten ("010-BEGINS-TO-BAND.png", "020-FIXED-MINDSET.png", etc.) so that the filesystem could easily show me the ordered list.  (I'm not sure if writing in all caps was another nod to the old 8-bit computer days.)

Wednesday, May 15, 2024

uh-oh is chatgpt slipping

 ChatGPT has proven itself a valuable programming partner, but with the new version, when I asked it to make a React component it started:

import React, { useState } from 'recat'; import styled from 'styled-components';

"recat"? That does not bode well...

Wednesday, May 1, 2024

remember to not require authentication for OPTIONS requests on your authenticated endpoints!

Recently I was on a project where we started using "bearer tokens" (like bearer bonds: if you have possession of the token, you're trusted).

But then we were getting CORS errors - they read like cross-origin issues? And the request with the bearer token would go ahead and work when exported as curl commands....

Cut to the chase: the browser was sending a preflight OPTIONS request to sniff around for the "Access-Control-Allow-Origin" header - but that endpoint was accidentally set to require the token. That's not how OPTIONS is supposed to work, so the browser doesn't include the token - and the 401 rejection of the OPTIONS request meant the browser would never try the actual GET.
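In Express terms (just as an illustration - the route and middleware here are hypothetical, not our actual setup), the fix amounts to making sure CORS/preflight handling runs before the auth check:

const express = require("express");
const cors = require("cors");
const app = express();

// cors() answers the token-less OPTIONS preflight itself, so it never hits the auth check.
app.use(cors());

// Auth middleware comes *after* - it only sees the real GET/POST requests.
app.use((req, res, next) => {
    if (!req.headers.authorization) return res.status(401).send("Missing bearer token");
    next();
});

app.get("/api/stuff", (req, res) => res.json({ ok: true }));

app.listen(3000);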

A little frustrating. It's like "c'mon browser - live a little, send the request, let the server figure out if it doesn't want to talk to you" -- sort of like giving a pep talk when you're playing wingman for your single friend at a bar.

coors = mediocre beer, CORS = annoying security gotchas

chatgpt as lossy compression

My favorite sci-fi author Ted Chiang wrote ChatGPT is a blurry jpeg of the web:
Imagine what it would look like if ChatGPT were a lossless algorithm. If that were the case, it would always answer questions by providing a verbatim quote from a relevant Web page. We would probably regard the software as only a slight improvement over a conventional search engine, and be less impressed by it. The fact that ChatGPT rephrases material from the Web instead of quoting it word for word makes it seem like a student expressing ideas in her own words, rather than simply regurgitating what she's read; it creates the illusion that ChatGPT understands the material. In human students, rote memorization isn't an indicator of genuine learning, so ChatGPT's inability to produce exact quotes from Web pages is precisely what makes us think that it has learned something. When we're dealing with sequences of words, lossy compression looks smarter than lossless compression.
Admittedly this was written last year but I think he underestimates the usefulness of ChatGPT in applying knowledge to a particular case at hand:
This analogy makes even more sense when we remember that a common technique used by lossy compression algorithms is interpolation--that is, estimating what's missing by looking at what's on either side of the gap. When an image program is displaying a photo and has to reconstruct a pixel that was lost during the compression process, it looks at the nearby pixels and calculates the average. This is what ChatGPT does when it's prompted to describe, say, losing a sock in the dryer using the style of the Declaration of Independence: it is taking two points in "lexical space" and generating the text that would occupy the location between them. ("When in the Course of human events, it becomes necessary for one to separate his garments from their mates, in order to maintain the cleanliness and order thereof. . . .") ChatGPT is so good at this form of interpolation that people find it entertaining: they've discovered a "blur" tool for paragraphs instead of photos, and are having a blast playing with it.
I'm willing to grant that asking ChatGPT to apply its embedded gleaned knowledge to a particular problem is basically that kind of interpolation, but in practice it is far more useful than making entertaining mashups. In my case, especially for technical tasks - as I previously quoted Dave Winer:
ChatGPT is like having a programming partner you can try ideas out on, or ask for alternative approaches, and they're always there, and not too busy to help out. They know everything you don't know and need to know, and rarely hallucinate (you have to check the work, same as with a human btw). It's remarkable how much it is like having an ideal human programming partner. It's the kind of helper I aspire to be.

Interesting world.

Monday, April 29, 2024

don't be afraid to duplicate

 "I generally follow the rule. Duplicate code until you have at least three examples. Then you can generalise.

So many times if you dedupe code which appears in two places that at first looks like the same code you later realise it is different behaviour and make the "general" function much more complex."

--Clair Blackshaw

Interesting to pair that with my general sense of the kneejerk tendency of some folk to look for a library rather than write a little bespoke code. A good library solves MANY scenarios at once, but since the whole point is kind of NOT understanding the solution as deeply (or not taking a lot of time to learn a new configuration language), if it does go wrong you're likely to have less clear sight into how to dig yourself out...

Monday, April 22, 2024

the chicken and the pig

Once upon a time I had official scrum master training with Ken Schwaber.

Besides the general scrum knowledge I acquired, some details have stuck with me, like the bill-shaped duck call he would use to get people's attention. (He said people were less likely to steal it.)
 

Also there was this introductory line, which seems oddly belligerent but reflects the Scrum folks' faith that they had the better idea (and indeed they did set the new standard, even if few places practice the pure version):

You suck... and that makes me sad.

Also I remember hearing this story - not quite sure if this is exactly the version, but:

A man walks into Fat Burger and orders a Double Fatburger, fries, and a drink.

Man only has $3.15 but the total comes to $7.15. The manager tells him he’s going to remove something from his order.  But the man insisted to have it all.

The manager doesn’t want to lose the customer so he walks out and finds a dead squirrel off the street.  He makes the burger by cooking the squirrel and putting it on a bun and hands it over to the man.

So if you draw the analogy of the story with the scenario above, it clearly seems that team compromises the quality just to deliver the product on time.

In order to achieve the unrealistic deadlines, first thing teams do is to discard the automated tests and stops refactoring the code. Soon after their code resembles coding they did in high school and they are making a huge mess.

But mostly I remember the metaphor of the chicken and the pig:

 

(Here's Vizdos' page on the origin of the cartoon)

The metaphor was that developers are the "pigs" whose bacon is on the line, so to speak, while the other people involved were "chickens" without skin in the game, and so should be quiet observers during the daily standup, for instance.

OK, for one thing, that is a WEIRD metaphor. Way back when, I sketched out a different final panel:

(Ken Schwaber was amused by the panel and asked to keep it.)

But that really tied into my problem with the metaphor: Product Owners and other non-devs DO have skin in the game - their jobs and reputations are at stake as well, and in some ways it's even tougher for them because they are dependent on devs and can't just "work harder" to get better results. (Also, true Scrum aims to guarantee predictability over time, and has relatively little to say about efficiency and timeliness. As my team lead Steve Katz put it: "the process isn't about not getting fired.")

I guess they've moved on from the chicken/pig metaphor anyway - it was a little too joke-y, and I think other people shared my view that non-dev stakeholders are still critical to the success of a project.


Friday, April 19, 2024

humane/her

Marques Brownlee reviews the Humane AI Pin... "the worst product I've ever reviewed... for now"


The form factor reminds me of this chef's kiss detail from the movie "Her" - Samantha's hardware is basically a foldable smartphone, but Twombly uses a safety pin to give her a boost in his pocket so she can see the world:

 (Also Brownlee had the clapback of the year to someone arguing he shouldn't have been so negative about a new striving-to-be-innovative product - "We disagree on what my job is")

 

 

on leadership

the biggest threat facing your team, whether you're a game developer or a tech founder or a CEO, is not what you think

Brilliant article on leadership. It's long and gets into the weeds of the games industry, but there is a lot that is true for the whole corporate world.

It touches on one point that is much on my mind: so much of our corporate leadership is "make number go up" (immediately! but then also forever.) Corporations generally have a legal obligation to "increase shareholder value", and in general that's on a per quarter basis. Sustainability and long term viability are afterthoughts at best.

The article points out there are parallels between that and some US policy decisions in Vietnam:
But when the McNamara discipline is applied too literally, the first step is to measure whatever can be easily measured. The second step is to disregard that which can't easily be measured or given a quantitative value. The third step is to presume that what can't be measured easily really isn't important. The fo[u]rth step is to say that what can't be easily measured really doesn't exist. This is suicide.
But then you combine that with leaders who view themselves as capable of the same finessed big-picture and aesthetic decisions as, say, Steve Jobs... well, they aren't always looking to the people reporting to them as potential Jony Ives - they want to go on their own guts.

So an organization has to thread the needle between "it only counts if it can be quantified" and "it only counts if it has good 'gut feel' to topmost leadership". I think you do that by building and then trusting the expertise of the people in the middle.

Tuesday, April 16, 2024

from "Headcrash"

"I mean." Uberman cleared his throat, adjusted his necktie, and began delivering his morning whine, which is clearly what he'd been intending to do all along. "This is, what? The third network outage this year?"

I stopped. "We're having some problems porting your database to our server, sir." I edged one step closer to the exit.

"I mean," Uberman scowled, "if I can't depend on your network, I'm screwed. Just totally screwed, you know?"

Then how come you're not smiling? is what I thought, but "We'll have it back up as soon as possible," is what I said.

"I mean," Uberman whacked his PC with his newspaper again, "we never had problems like this before MDE acquired us. Dammit, our old Applied Photonics network never crashed! Not once!"

"So I've heard." And heard, and heard, and heard! And if you gave me just sixteen users in a one-floor office, I could make this network look pretty good, too.

Bruce Bethke, "Headcrash"
"Headcrash" is kind of a no-account cyberpunk-y book from the mid-90s... the technobabble is pretty clumsy, but for some reason this passage has stuck with me for 20 years so I thought I'd post it - from time to time, its reminder that little toy systems can get away with things that projects you want to scale can't is useful.

Thursday, April 4, 2024

ux ideas!

 Some UX humor ideation from Soren Iverson ...
Some of my favorites:





easy drag-and-drop sorting/reordering with no build system vanilla js using SortableJS

I've already linked to a defense of making webapps without a build system - it really is great for long-term sustainability and updates, and PHP + vanilla JS are actually really powerful tools these days. You can even build dynamic page parts in a declarative style, using JSX-looking string templates: 

const list = document.getElementById("myList");
list.innerHTML = items.map((item)=>`<li>${item}</li>`).join("");

It's all right there in the browser these days!

One thing I didn't know how to do was add drag-and-drop reordering - at least not in a way that was mobile-friendly. (The browser's Drag and Drop API is one place where the usual abstraction between computers with mice or touchpads and mobile devices with touch screens breaks down.)

I knew of SortableJS but the README didn't make it clear that it works without a build system. ChatGPT straightened me out - using SortableJS is just a <script> tag away:

<script src="https://cdn.jsdelivr.net/npm/sortablejs@1.14.0/Sortable.min.js"></script>

(Hmm, there's no integrity (SRI) checksum on that tag, so you might want to grab the file and host it locally - in general, eliminating external dependencies is good practice, tying into that long-term sustainability: linkrot is real and ubiquitous, and even well-meaning CDNs broke projects depending on them as stuff switched from http to https...)

But back to sortable! Here's about the simplest example:
<script src="Sortable.min.js"></script>
<ol id="items-list">
    <!-- List items will be dynamically inserted here -->
</ol>
<script>
document.addEventListener('DOMContentLoaded', () => {
    const items = ["Apple", "Banana", "Cherry", "Date"];

    const listElement = document.getElementById('items-list');
    listElement.innerHTML = items.map(item => `<li>${item}</li>`).join('');

    new Sortable(document.getElementById('items-list'), {
        animation: 150, // Animation speed (milliseconds)
        onEnd: function (evt) {
            items.splice(evt.newIndex, 0, items.splice(evt.oldIndex, 1)[0]);
        }
    });
});
</script>


You can see it in action - everything is right there in the source.

So one slightly wonky aspect is that this kind of reverses the usual flow of declarative programming: Sortable updates the DOM itself and then relies on the onEnd callback to make your internal state match. Still, a small price to pay for a mobile-friendly reordering solution; so much better than providing little /\ and \/ buttons!

You can go a little further and add some grabbable handles - here's how to make those three lines:
<span class="drag-handle">&#x2630;</span>

Add the reference to the Sortable options object:
handle: ".drag-handle",

and then a little CSS:
  ul {
    list-style-type: none;
    padding-left: 0;
  }
  .drag-handle {
    cursor: grab; /* Changes the cursor to indicate movability */
    padding-right: 10px; /* Spacing between the handle and the text */
    user-select: none; /* Prevents text selection */
  }

  /* Style the drag handle on hover and active states for better feedback */
  .drag-handle:hover, .drag-handle:active {
    cursor: grabbing;
  }


Here's a working example of that.

Finally, if you DO re-render the list, you should probably hold onto the returned Sortable object, call .destroy(), and then reapply. That's obviously more tied into whatever app you're actually making, but here is a basic destroy/re-render example as well, which is a bit closer to my use case.
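
In case it helps, here's a minimal sketch of that pattern, building on the same #items-list markup as above (when you call renderList again is up to your app):

let sortable = null;
const items = ["Apple", "Banana", "Cherry", "Date"];

function renderList() {
    const listElement = document.getElementById("items-list");
    if (sortable) sortable.destroy();   // drop the old instance's listeners first
    listElement.innerHTML = items.map(item => `<li>${item}</li>`).join("");
    sortable = new Sortable(listElement, {
        animation: 150,
        onEnd: (evt) => {
            items.splice(evt.newIndex, 0, items.splice(evt.oldIndex, 1)[0]);
        }
    });
}

renderList();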

Wednesday, April 3, 2024

robohorn

As a kid I wondered if you could make a robot trumpet player. The answer is now yes. I wonder how the cyber-embouchure works... (you can google up a robot sax player as well...)

Related Diesel Sweeties comic 10:



dangerous times

A while back I posted I'm harvesting credit card numbers and passwords from your site. Here's how. - a fake (or was it?) description of how the overwhelming number of npm-ish dependencies can make your webapp vulnerable if a bad actor makes a helpful-looking tiny utility (that the framework you like uses, even if you don't see it as worthwhile) and covers their tracks well. 

What we know about the xz Utils backdoor that almost infected the world is along the same lines, except there I can't just preach the gospel of "use fewer dependencies!"

None of this is new - the seminal Reflections on Trusting Trust - where a trojan could be snuck into a C compiler, covering its tracks all along the way - is 40 years old. But it's scary.

Related: You Are All On The Hobbyists Maintainers' Turf Now. The business world has absolutely embraced the Open Source paradigm - or at least decided to take freely of its fruit - and so the risk of poisoned flowers is there, even as more and more we depend on the goodwill, "doing it for the reputation and to scratch my own itch" work of fewer and fewer people - or as XKCD put it:




Monday, April 1, 2024

ObHack: see if I've hit quota

 I know I've mentioned Usenet's alt.hackers and "ObHacks" before... the latest is this:

My main VPS is pretty old (I'm migrating to a new one) and it doesn't do a good job of letting me know how close to my disk quota I am - but when I start having problems, the clearest telltale is that if I try writing to a file from the terminal, the file is made but is zero bytes.

But on that same server I run my personal start page, so many times during the day I'm going back to that site. So now the script that generates that page does this:

// QUOTA PROBLEM CHECK
// Get the current Unix timestamp
$timestamp = time();
// Write it to a file named "timetest.txt"
file_put_contents("timetest.txt", $timestamp);
// Initialize $bodyclass with an empty value
$bodyclass = "";
// Open the file and read the timestamp
$readTimestamp = file_get_contents("timetest.txt");

// Check if we can open the file, if it's not empty, and if the contents match
if ($readTimestamp === FALSE
        || $readTimestamp == ""
        || $readTimestamp != $timestamp) {
    $bodyclass = "alert";
}

and I made the body.alert CSS to be bright red, so I'll see that something is up.


Heh, that piece of PHP reminds me of a thought I had recently. In general I don't adore TypeScript: while I like having JSON arguments described, there are better ways to do that, I find the syntax makes code a bit harder to read, and there's more chance of a false sense of security, since you don't REALLY know what types are going to come back from a given endpoint at runtime.

My observation with that is that 90% of the typing issues I *do* have in JS would go away if "+" always meant addition and not string concatenation - which is how PHP does it. But then I realized that PHP only gets away with using "." for concatenation because it prefixes its variables with "$" (and uses "->" for property access); in JS, "." already means property lookup, so you'd be stuck writing object["key"] everywhere.
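
To illustrate the footgun (a minimal sketch, with a hypothetical form field):

// "+" silently concatenates as soon as either operand is a string.
const qty = document.querySelector("#qty").value;  // form values are always strings, e.g. "2"

console.log(qty + 1);           // "21" - concatenation, probably not what you wanted
console.log(Number(qty) + 1);   // 3   - explicit conversion gets you actual addition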

Monday, March 25, 2024

baby mock json endpoint server and cloning endpoints

My boss thought it strange that we didn't already have a "mock server" so that we could keep doing UI work in case the endpoints went down.

He suggested using json-server but that wasn't made to emulate a rich set of endpoints, just one level deep, either GETing all the entries or just one at a time by id. Luckily that kind of server is the easy part of the assignment.

The first part was to make a file "urls.txt", with just the relevant part of the endpoint...
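
(Something like this, one path per line - these particular paths are just made-up examples:)

/api/customers
/api/customers/42
/api/orders/recent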

then this script load-db.js hits each of those entries and writes the content to a file in mocks/

const express = require("express");
const fs = require("fs");
const path = require("path");
const app = express(); // (express/app aren't actually used in this loader - fetch + fs do all the work)

// Ensure the mocks directory exists
const mocksDir = path.join(__dirname, "mocks");
if (!fs.existsSync(mocksDir)) {
    fs.mkdirSync(mocksDir);
}

// Read the command line argument for the prefix (the base URL that gets prepended to each path)
const prefix = process.argv[2];

// Read the URLs from urls.txt
const urlsFilePath = path.join(__dirname, "urls.txt");
const urls = fs.readFileSync(urlsFilePath, "utf8").split("\n");

// Function to make a safe filename from a URL path and append .json
const makeSafeFilename = (urlPath) => encodeURIComponent(urlPath.replace(/^\//, "").replace(/\//g, "_")) + ".json";

// Fetch content and save it as JSON (global fetch needs Node 18+; for older Node, pull in node-fetch from package.json)
urls.forEach(async (urlPath) => {
    if (!urlPath.trim()) return; // Skip empty lines
    const content = await fetch(`${prefix}${urlPath}`).then((res) => res.json());
    const safeFilename = makeSafeFilename(urlPath);
    fs.writeFileSync(path.join(mocksDir, safeFilename), JSON.stringify(content, null, 2));
    console.log(`Saved content from ${urlPath} to ${safeFilename}`);
});
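
So (with a made-up staging host) you'd run it like "node load-db.js https://staging.example.com" and end up with one .json file per line of urls.txt in mocks/.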

Then the server.js just looks like this:

const express = require("express");
const fs = require("fs");
const path = require("path");
const cors = require("cors");
const app = express();

app.use(cors());

const mocksDir = path.join(__dirname, "mocks");

app.get("/", (req, res) => {
    fs.readdir(mocksDir, (err, files) => {
        if (err) {
            console.error(err);
            return res.status(500).send("Server error");
        }

        const links = files
            .filter((file) => file.endsWith(".geojson") || file.endsWith(".json"))
            .map((file) => {
                const decodedFilename = file.replace(/\..+$/, "").replace(/_/g, "/");
                const encodedPath = decodedFilename
                    .split("/")
                    .map((part) => encodeURIComponent(part))
                    .join("/");
                return `<li><a href="/${encodedPath}">${file}</a></li>`;
            })
            .join("");

        res.send(`<ul>${links}</ul>`);
    });
});

app.get("*", (req, res) => {
    const safeFilename = encodeURIComponent(req.path.replace(/^\//, "").replace(/\//g, "_"));
    const filePath = path.join(mocksDir, safeFilename);

    if (fs.existsSync(`${filePath}.geojson`) || fs.existsSync(`${filePath}.json`)) {
        res.type("application/json");
        res.sendFile(fs.existsSync(`${filePath}.geojson`) ? `${filePath}.geojson` : `${filePath}.json`);
        console.log(req.path);
    } else {
        res.status(404).send("Not Found");
        console.error(`404 ${req.path}`);
    }
});

// Use environment variable for port or a default value
const port = process.env.PORT || 3000;

app.listen(port, () => {
    console.log(`Mock server listening at http://localhost:${port}`);
});




package.json is

{
  "name": "mock-server",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "start": "node server.js"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "cors": "^2.8.5",
    "express": "^4.18.3",
    "node-fetch": "^3.3.2"
  }
}



(Sigh, I suppose if I were feeling ambitious I should make a little project out of this and put it on github, but I'm sure someone has, and better.)

UPDATE: this version now lists the endpoints/files it knows about if you go to the root, and listens to process.env.PORT for which port to use.

Saturday, March 16, 2024

jQuery revisited

It's funny, M-W I ran myself a little ragged substitute teaching each night from like 5-10:30.

But Wednesday it was a lesson in jQuery - obviously the curriculum is a little long in the tooth (pointing out yes, jQuery is passé but still shows up on like 70-80% of websites), but it was like revisiting an old friend you hadn't seen in ages.


Nowadays browsers are pretty much standardized, and the fetch API covers AJAX pretty well - two of the sweet spots jQuery fixed - but to think jQuery was making both easy decades ago! For a long time I was a big fan of https://youmightnotneedjquery.com/ which shows the vanilla equivalent of most of the important jQuery uses - though the jQuery syntax is generally cleaner. Honestly, jQuery's syntax for animating an item is still much more intuitive than CSS Animations if the situation is imperative and not the result of a screen scroll or whatnot. (I learned CSS hand-in-hand with jQuery and they were like the perfect complements to take in together.)


But try to argue "hey, PHP and jQuery were/are pretty good" and they laugh you out of the room. (Or, more realistically, make a mental note that you may be too old and out of it to hire.) This ignores the counter-vibe of "oh, maybe rendering everything on the client for the last ten years was a mistake" - server-side rendering (in PHP) and dolling things up as needed (in jQuery, or vanilla JS) is pretty fast and effective AND stable. I have websites that keep running and running with zero maintenance needed, but when it's time to add a feature, that's pretty straightforward as well. (Hell, one of the young punk frameworks is "htmx" - its big trick is stuffing HTML bits from the server right into the DOM. Which has been a single command in jQuery

$( "#resultdiv" ).load( "ajax/test.html" );
 

for years and years.)
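
(The vanilla equivalent these days isn't bad either - a minimal sketch, reusing that same hypothetical div and URL:)

fetch("ajax/test.html")
    .then((res) => res.text())
    .then((html) => {
        document.getElementById("resultdiv").innerHTML = html;
    });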

And of course there's always Pieter Levels sticking to a PHP and jQuery stack for multiple sites that each pull in tens of thousands per month. 

Heh, maybe jQuery needs its own Taking PHP Seriously page for me to link to when I'm feeling defensive - well, no, I do stick with vanilla JS over jQuery - but it's not that much better, except that it feels great to not have the dependency.

Thursday, March 14, 2024

the 3-minute programmer

My partner Melissa got us a cool game - OuiSi. Mostly it's a bunch of very pretty, somewhat abstract photo cards, and then the book that comes with it lists different games, some creative, some a bit more competitive.

One is cooperative yet competitive - "OuiSi Capture" - it plays a bit like "Codenames": the group lays out a 5x5 grid of cards, and each round one person is the designated "clue-giver" and picks one word to describe exactly 5 photos. The other players try to guess the five - each correct card is a point for the communal pile, each wrong guess gives 2 points to the penalty pile, and the first pile to 25 wins.

The game suggests arranging 5 coins behind a screen to help remember which cards the clue giver picked, but I thought it would be a little easier to have a webpage to run on a laptop to record the picks. Melissa didn't want to wait for me to program, but I promised her I could get it done in 3 minutes...

ChatGPT to the rescue!

The prompt:

give me an html page with 5 by 5 thing of regular pushbuttons. Each button is a square. Caption for each starts with "_". When it's click it toggles back and forth to "X" and "_"
 
Here's the result! Worked like a charm
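
(ChatGPT's actual output isn't reproduced here, but a minimal vanilla JS version of what that prompt describes might look like this:)

// Build a 5x5 grid of square-ish buttons that toggle between "_" and "X" when clicked.
const grid = document.createElement("div");
grid.style.display = "grid";
grid.style.gridTemplateColumns = "repeat(5, 3em)";
grid.style.gap = "0.25em";

for (let i = 0; i < 25; i++) {
    const btn = document.createElement("button");
    btn.textContent = "_";
    btn.style.height = "3em";
    btn.addEventListener("click", () => {
        btn.textContent = (btn.textContent === "_") ? "X" : "_";
    });
    grid.appendChild(btn);
}

document.body.appendChild(grid);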

Wednesday, March 6, 2024

vanilla, it's the finest of the flavors

 What is Utility-First CSS? - ripping into what is almost certainly a terrible paradigm for CSS modeling.

With so many frameworks and toolkits out there, it feels like a lot of people haven't noticed that vanilla JS and CSS have gotten pretty good and are certainly reasonable choices for many tasks. I think the secret sauce is browsers embracing some brilliant emergent standards - helped by browsers that update themselves, unlike the bad old days of wondering what version of IE you had to support. (And for better or worse the number of rendering engines has dramatically diminished, so bad quirks are even less common.) Related: A Manifesto for Small Static Web Apps

Monday, March 4, 2024

more on the early prototype

My manager encourages us to look to Steve Jobs for inspiration, and I was surprised I hadn't posted about some iPod and iPhone prototypes that were making the rounds a while back - one is this beauty of a breadboard mockup for the first iPod:


 (I think the screen is about the size of what they had on the first production unit, which gives you an idea of what an absolute unit this is)

The other is competing prototypes for what would run on the iPhone - when this video dropped, people were surprised that a wheel-based concept was in the running. (But remember how psyched people were that the iPhone ran a flavor of OS X? Now it seems inconsequential or obvious, like any gadget running a stripped-down version of the Linux kernel...)

Thursday, February 29, 2024

ipod click and skip

Some hidden history of the iPod. My boss digs Steve Jobs' attitudes about excellence - and an aptitude for taking resources at hand (in the case of the iPod, a new small Toshiba hard drive) and applying them in novel ways.

Two things I hadn't heard much about:

  1. the signature click wheel drew heavily from a phone, the Bang & Olufsen BeoCom 6000
  2. part of the secret sauce was a large 32MB "skip buffer" - advertised as "20 minute skip protection" (remember, this was an age of jostled portable CD players leading to poor experiences). Its true purpose was buffering songs, so the device could load a few at once rather than keep the little hard drive constantly spinning, roughly tripling the battery life to meet critical performance metrics.

Wednesday, February 28, 2024

dutch tulips anyone?

Really excellent overview charting the first dot com bust through to the current dreams of AI as the savior for tech - going to copy/paste it here for posterity...

(I would say too, it's really bad that the only micro-/nano-transaction model we came up with for the Internet was ads, with all the privacy breaking and other nonsense that comes with them.)

If anyone wants to know why every tech company in the world right now is clamoring for AI like drowned rats scrabbling to board a ship, I decided to make a post to explain what's happening.


(Disclaimer to start: I'm a software engineer who's been employed full time since 2018. I am not a historian nor an overconfident Youtube essayist, so this post is my working knowledge of what I see around me and the logical bridges between pieces.)


Okay anyway. The explanation starts further back than what's going on now. I'm gonna start with the year 2000. The Dot Com Bubble just spectacularly burst. The model of "we get the users first, we learn how to profit off them later" went out in a no-money-having bang (remember this, it will be relevant later). A lot of money was lost. A lot of people ended up out of a job. A lot of startup companies went under. Investors left with a sour taste in their mouth and, in general, investment in the internet stayed pretty cooled for that decade. This was, in my opinion, very good for the internet as it was an era not suffocating under the grip of mega-corporation oligarchs and was, instead, filled with Club Penguin and I Can Haz Cheezburger websites. 


Then around the 2010-2012 years, a few things happened. Interest rates got low, and then lower. Facebook got huge. The iPhone took off. And suddenly there was a huge new potential market of internet users and phone-havers, and the cheap money was available to start backing new tech startup companies trying to hop on this opportunity. Companies like Uber, Netflix, and Amazon either started in this time, or hit their ramp-up in these years by shifting focus to the internet and apps.


Now, every start-up tech company dreaming of being the next big thing has one thing in common: they need to start off by getting themselves massively in debt. Because before you can turn a profit you need to first spend money on employees and spend money on equipment and spend money on data centers and spend money on advertising and spend money on scale and and and


But also, everyone wants to be on the ship for The Next Big Thing that takes off to the moon.


So there is a mutual interest between new tech companies, and venture capitalists who are willing to invest $$$ into said new tech companies. Because if the venture capitalists can identify a prize pig and get in early, that money could come back to them 100-fold or 1,000-fold. In fact it hardly matters if they invest in 10 or 20 total bust projects along the way to find that unicorn. 


But also, becoming profitable takes time. And that might mean being in debt for a long long time before that rocket ship takes off to make everyone onboard a gazzilionaire. 


But luckily, for tech startup bros and venture capitalists, being in debt in the 2010's was cheap, and it only got cheaper between 2010 and 2020. If people could secure loans for ~3% or 4% annual interest, well then a $100,000 loan only really costs $3,000 of interest a year to keep afloat. And if inflation is higher than that or at least similar, you're still beating the system. 


So from 2010 through early 2022, times were good for tech companies. Startups could take off with massive growth, showing massive potential for something, and venture capitalists would throw infinite money at them in the hopes of pegging just one winner who will take off. And supporting the struggling investments or the long-haulers remained pretty cheap to keep funding. 


You hear constantly about "Such and such app has 10-bazillion users gained over the last 10 years and has never once been profitable", yet the thing keeps chugging along because the investors backing it aren't stressed about the immediate future, and are still banking on that "eventually" when it learns how to really monetize its users and turn that profit.


The pandemic in 2020 took a magnifying-glass-in-the-sun effect to this, as EVERYTHING was forcibly turned online which pumped a ton of money and workers into tech investment. Simultaneously, money got really REALLY cheap, bottoming out with historic lows for interest rates. 


Then the tide changed with the massive inflation that struck late 2021. Because this all-gas no-brakes state of things was also contributing to off-the-rails inflation (along with your standard-fare greedflation and price gouging, given the extremely convenient excuses of pandemic hardships and supply chain issues). The federal reserve whipped out interest rate hikes to try to curb this huge inflation, which is like a fire extinguisher dousing and suffocating your really-cool, actively-on-fire party where everyone else is burning but you're in the pool. And then they did this more, and then more. And the financial climate followed suit. And suddenly money was not cheap anymore, and new loans became expensive, because loans that used to compound at 2% a year are now compounding at 7 or 8% which, in the language of compounding, is a HUGE difference. A $100,000 loan at a 2% interest rate, if not repaid a single cent in 10 years, accrues to $121,899.  A $100,000 loan at an 8% interest rate, if not repaid a single cent in 10 years, more than doubles to $215,892.


Now it is scary and risky to throw money at "could eventually be profitable" tech companies. Now investors are watching companies burn through their current funding and, when the companies come back asking for more, investors are tightening their coin purses instead. The bill is coming due. The free money is drying up and companies are under compounding pressure to produce a profit for their waiting investors who are now done waiting. 


You get enshittification. You get quality going down and price going up. You get "now that you're a captive audience here, we're forcing ads or we're forcing subscriptions on you." Don't get me wrong, the plan was ALWAYS to monetize the users. It's just that it's come earlier than expected, with way more feet-to-the-fire than these companies were expecting. ESPECIALLY with Wall Street as the other factor in funding (public) companies, where Wall Street exhibits roughly the same temperament as a baby screaming crying upset that it's soiled its own diaper (maybe that's too mean a comparison to babies), and now companies are being put through the wringer for anything LESS than infinite growth that Wall Street demands of them. 


Internal to the tech industry, you get MASSIVE wide-spread layoffs. You get an industry that used to be easy to land multiple job offers shriveling up and leaving recent graduates in a desperately awful situation where no company is hiring and the market is flooded with laid-off workers trying to get back on their feet. 


Because those coin-purse-clutching investors DO love virtue-signaling efforts from companies that say "See! We're not being frivolous with your money! We only spend on the essentials." And this is true even for MASSIVE, PROFITABLE companies, because those companies' value is based on the Rich Person Feeling Graph (their stock) rather than the literal profit money. A company making a genuine gazillion dollars a year still tears through layoffs and freezes hiring and removes the free batteries from the printer room (totally not speaking from experience, surely) because the investors LOVE when you cut costs and take away employee perks. The "beer on tap, ping pong table in the common area" era of tech is drying up. And we're still unionless. 


Never mind that last part.


And then in early 2023, AI (more specifically ChatGPT, OpenAI's Large Language Model creation) tears its way into the tech scene with a meteor's amount of momentum. Here's Microsoft's prize pig, which it invested heavily in and is gallivanting around the pig-show with, to the desperate jealousy and rapture of every other tech company and investor wishing it had that pig. And for the first time since the interest rate hikes, investors have dollar signs in their eyes, both venture capital and Wall Street alike. They're willing to restart the hose of money (even with the new risk) because this feels big enough to be worth that risk.


Now all these companies, who were in varying stages of sweating as their bill came due, or wringing their hands as their stock prices tanked, see a single glorious gold-plated rocket up out of here, the likes of which haven't been seen since the free money days. It's their ticket to buy time, and buy investors, and say "see THIS is what will wring money forth, finally, we promise, just let us show you."


To be clear, AI is NOT profitable yet. It's a money-sink. Perhaps a money-black-hole. But everyone in the space is so wowed by it that there is a widespread and powerful conviction that it will become profitable and earn its keep. (Let's be real, half of that profit "potential" is the promise of automating away the jobs of pesky employees who peskily cost money.) It's a tech-space industrial revolution that will automate away skilled jobs, and getting in on the ground floor is the absolute best thing you can do to get your pie slice's worth.


It's the thing that will win investors back. It's the thing that will get the investment money coming in again (or get it second-hand, if the company can be the PROVIDER of something needed for AI, which other companies with venture backing will pay handsomely for). It's the thing companies are terrified of missing out on, lest it leave them utterly irrelevant in a future where not having AI integration is like not having a mobile phone app for your company, or not having a website.


So I guess, to reiterate my earlier point:


Drowned rats. Swimming to the one ship in sight. 


 

Wednesday, February 21, 2024

AI-yi-yi

 

Subprime Intelligence - a thoughtful piece about the strength of the AI boom - especially the question of whether we may be nearing the top of an S-shaped curve of capabilities.

I think it's a great point that for much of the generative stuff, the "killer app" isn't there (though there are a lot of small use cases, and LLMs are certainly a boon to helping software developers navigate the thick woods of toolkits and languages we've created.)

It will also be interesting to see how the public's "AI-Radar" increases, and how much people start to notice and possibly get annoyed at the garish flatness of it all.

But especially the economics... hearing how Microsoft's "investment" in OpenAI was more like a donation of computer time, and other investments seem more about tech giants trying to ensure they're the ones selling the raw cloud horsepower... sometimes you wonder where we are all going with this.

catching up on newsletters

Catching up on frontend newsletters...

  •  I'm always impressed by The Coding Train - I love p5/processing and these videos seem like a great way to learn programming.
  • This rise and possible fall of the square checkbox was a fun visual history.
  • The state of the states...
  • Back in the caveman days of Perl CGI, it sort of made sense that I used the same language for quick-and-dirty shell scripts and web programming... but now that my go-to is PHP for the latter (at least for the server side)? Having a command line script inside <?php ?> always feels weird. Maybe I should switch to Bun for that? (See the sketch after this list.)
  • Web Components in Ernest seems pretty deep
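
Here's the kind of thing I mean with that Bun musing - a hypothetical quick-and-dirty CLI script (file name and all made up) that Bun would run directly, no <?php ?> wrapper:

    #!/usr/bin/env bun
    // count-lines.ts - run as: bun count-lines.ts somefile.txt
    const path = process.argv[2];              // Bun supports Node-style process.argv
    if (!path) {
      console.error("usage: bun count-lines.ts <file>");
      process.exit(1);
    }
    const text = await Bun.file(path).text();  // Bun.file() gives a lazy file handle; .text() reads it
    console.log(`${text.split("\n").length} lines in ${path}`);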

beginners who like TypeScript

from  3 Lessons from Building a Chrome Extension with TypeScript 

It is striking to me that the "#1 lesson learned" is 

TypeScript feels like Python but JavaScript. It might be a reductive way of describing it, but it feels a lot friendlier to read for a non-techie, self-taught pleb like myself.

The Type interface (even though not as heavily utilized in this project) is amazing. Not struggling with broken code just cause I've inputted a different type that is not supposed to be there.

The streamlined approach to getting what you want, rather than the run around that JS makes you do, helps provide a greater understanding of what is happening from function to function.

I find this such an alien take. For me TS is the opposite of "streamlined" - like, I appreciate it for expressing the shapes of arbitrary ad hoc JSON formats (though there are other ways of getting that) but I'm more likely to resent the syntax mess it imposes - even by itself, JS (and especially with React) is already juggling a pretty small # of characters ( { } : = ? ), so what any given character means in a snippet of code is way more dependent on context, and so harder to read at a glance. And "this only works with a fancy build environment" furthers that lack of a sense of being streamlined. While I know JS in the browser is gonna be some kind of sandbox in a sandbox, having to trust a source map to get me from the code I'm inspecting to the code I wrote never feels great.
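
A tiny made-up snippet of what I mean, where ':' and '?' each get pressed into two or three different jobs:

    // ':' = type annotation, '?' = optional parameter
    function greet(name?: string): string {
      // ':' = type annotation again, then ':' = plain old object key
      const fallback: { label: string } = { label: "friend" };
      // '?' and ':' = good old ternary
      return name ? `hi ${name}` : `hi ${fallback.label}`;
    }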

I dunno. Sometimes I worry BASIC, Perl and JS broke my brain, and that my feelings that duck typing greases the wheels more than it creates problems and that the speed of a quick interpretation-vs-compiling loop was worth its weight in gold are actually signs of derangement. Maybe Dijkstra was right when he said

"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."

(I always assumed he was talking more about line numbers....)

Like I don't MIND strong typing of primitive types, but would prefer the syntax was prettier than

let foo: string | undefined;

(like from my Java days, I wish the descriptors came before the variable name)

Also the author didn't have to wrestle with some of TypeScript's inheritance ...

Or maybe brains are just different. 

FOLLOWUP:
On a private Slack we got into talking about the pros/cons of types. 

I wrote

Sometimes it feels like "type" as the term for both primitive types and key/value conglomerations is weird.

like I know if you were thinking in terms of classes it should all be the same thing. But JS is very much not really that class-centric...


Ashish wrote 

For me everything is a type. Records/structs have type. Map/Dictionary is a type. Functions are type (takes parameters of certain type and  returns a result of certain type) etc. i recommend Elm or purescript to check out if you like front end stuff.
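
(For what it's worth, in TypeScript terms I think that worldview looks roughly like this - my sketch, not Ashish's:)

    type Point = { x: number; y: number };             // record/struct as a type
    type Scores = Map<string, number>;                 // map/dictionary as a type
    type Distance = (a: Point, b: Point) => number;    // a function signature as a type
    const manhattan: Distance = (a, b) => Math.abs(a.x - b.x) + Math.abs(a.y - b.y);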


My response was:

https://elm-lang.org/examples/buttons - this does not look like it would play well with my head.  Seems like a bit too many layers of abstraction from what I think of as UI these days :smile:

It makes me think of CarGurus, where we used https://immutable-js.com/ - I think the benefits of predictability you get with immutability were way outweighed by the lack of syntactic sugar and sometimes the performance implications. (not helped by the wayyy academic documentation for it)

I mean that's the whole thing with functional programming and UI - almost by definition, almost everything interesting a UI does is a side effect (most often showing something to the user) - the very opposite of what functional programming wants to be

(Getting more deeply into React, it feels more and more rickety - like it just puts MVC into one big mess of render for the view and controller to mess with the model but not TOO much)
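
To make the Immutable.js gripe concrete, here's roughly the syntax tax I mean (a made-up snippet, not actual CarGurus code):

    import { Map } from "immutable";

    // plain object: read and "update" with ordinary syntax
    const plain = { count: 1 };
    const plainNext = { ...plain, count: plain.count + 1 };
    console.log(plainNext.count);          // 2

    // Immutable.js Map: every read and write goes through get()/set()
    const im = Map({ count: 1 });
    const imNext = im.set("count", im.get("count", 0) + 1);
    console.log(imNext.get("count", 0));   // 2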

I think of this (1998!) quote:

Menu items are the modern programmer's way -- even that of the Java programmer, who is too pure of heart to use pointers -- of putting an obscene number of unpredictable GOTO statements everywhere in his code.

jhayward@imsa.edu in 1998, via Usenet's "rec.humor.funny"