tag:blogger.com,1999:blog-87294070870183254472024-03-16T15:09:42.859-04:00Kirk's UI Dev BlogKirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.comBlogger1436125tag:blogger.com,1999:blog-8729407087018325447.post-21159795200669942062024-03-16T15:09:00.001-04:002024-03-16T15:09:06.189-04:00jQuery revisited<p>It's funny - M-W I ran myself a little ragged substitute teaching each night from like 5-10:30.<br /><br />But Wednesday it was a lesson in jQuery - obviously the curriculum is a little long in the tooth (pointing out yes, jQuery is passé but still shows up on like 70-80% of websites) but it was like revisiting an old friend you hadn't seen in ages.</p><p><br />Nowadays browsers are pretty much standardized, and the "fetch" API covers AJAX pretty well - two of the sweet spots jQuery fixed - but to think jQuery was making both easy decades ago! For a long time I was a big fan of <a href="https://youmightnotneedjquery.com/">https://youmightnotneedjquery.com/</a> which showed the Vanilla equivalent of most of the important jQuery uses - though the jQuery syntax is generally cleaner. Honestly, jQuery's syntax for animating an item is still much more intuitive than CSS Animations if the situation is imperative and not the result of a screen scroll or whatnot. (I learned CSS hand-in-hand with jQuery and they were like the perfect complements to take in together)</p><p><br />But try and argue that "hey PHP and jQuery were/are pretty good" and they laugh you out of the room. (Or more realistically, make a mental note that you may be too old and out of it to hire) Ignoring the counter vibe that "oh maybe rendering everything on the client for the last ten years was a mistake" - but server-side rendering (in PHP) and dolling things up as needed (in jQuery, or Vanilla JS) is pretty fast and effective AND stable - I have websites that keep running and running with zero maintenance needed, but when it's time to add a feature, that's pretty straightforward as well. (Hell, one of the young punk frameworks is "htmx" - its big trick is stuffing HTML bits from the server right into the DOM. Which has been a single command in jQuery</p><p><span style="font-family: courier;">$( "#resultdiv" ).load( "ajax/test.html" );</span><br /> </p><p>for years and years.) </p><p>And of course there's always Pieter Levels sticking to a <a href="https://read.engineerscodex.com/p/10-lessons-from-software-side-ventures?utm_source=tldrwebdev">PHP and jQuery stack</a> for multiple sites that each pull in tens of thousands per month. </p><p>Heh, maybe jQuery needs its own <a href="https://slack.engineering/taking-php-seriously/">Taking PHP Seriously</a> page for me to link to when I'm feeling defensive - well, no, I do stick with vanilla JS over jQuery - but it's not that much better, except it feels great to not have the dependency.<br /><br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-30669937170842438192024-03-14T21:41:00.002-04:002024-03-14T21:41:12.024-04:00the 3-minute programmer<p>My partner Melissa got us a cool game - <a href="https://shop.ouisi.co/">OuiSi</a>. 
Mostly it's a bunch of very pretty, somewhat abstract photo cards, and then the book that comes with it lists different games, some creative, some a bit more competitive.</p><p>One is cooperative yet competitive - "OuiSi Capture" - it plays a bit like "Codenames": the group lays out a 5x5 grid of cards, and each round one person is the designated "clue-giver" and picks one word to describe exactly 5 photos. The other players try to guess the five - each correct card is a point for the communal pile, each wrong guess gives 2 points to the penalty pile - first pile to 25 wins. </p><p>The game suggests arranging 5 coins behind a screen to help remember which cards the clue-giver picked, but I thought it would be a little easier to have a webpage to run on a laptop to record the picks. Melissa didn't want to wait for me to program, but I promised her I could get it done in 3 minutes...</p><p>ChatGPT to the rescue!</p><p>The prompt:<br /></p><p><i>give me an html page with 5 by 5 thing of regular pushbuttons. Each button is a square. Caption for each starts with "_". When it's click it toggles back and forth to "X" and "_"</i></p><p><a href="https://stuff.alienbill.com/ouisi/">Here's the result</a>! Worked like a charm <br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-67568942918629448702024-03-10T21:59:00.002-05:002024-03-10T21:59:14.267-05:00programming theory and practice<p> </p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjU8dUibHld57MS0Kr8_J26tMvEDmBEo7ZxVY205HPWLAbOzOceLMLDhfdZC3IFKUjrY_7u5nz4iaMyO5Hdqel3Pf1rIiolwdz4jsJYpGD55q3uYQiis-V11DZaaRUypb8GkZXrMD3jhI0oEdKQ215goqu1n8hQpw5HAqzG1VCJkjozS3etjvFyfzZRDpNW/s727/funniest-coding-memes-24.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="727" data-original-width="585" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjU8dUibHld57MS0Kr8_J26tMvEDmBEo7ZxVY205HPWLAbOzOceLMLDhfdZC3IFKUjrY_7u5nz4iaMyO5Hdqel3Pf1rIiolwdz4jsJYpGD55q3uYQiis-V11DZaaRUypb8GkZXrMD3jhI0oEdKQ215goqu1n8hQpw5HAqzG1VCJkjozS3etjvFyfzZRDpNW/w321-h400/funniest-coding-memes-24.jpg" width="321" /></a></div><br /><p></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-4658807405477416362024-03-08T15:50:00.005-05:002024-03-08T15:50:50.932-05:00there are RULES<p> </p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJszRq1_PJy7_CM8NZ-Y0h9CHrMM9hUh8nHjI0Naqp8gPUtbONIN8iuNm9_6qjfW82nbi4PLaA_PFhSv6q1HwSi0A7TlxMOooIR8Yy8r3PNgtXuaP6URtBQT2oZ4RgsefBr2ch6NoOkXf4zq5tSsuPBYzlCtCQVGgS4qoKh4W1ObN3WyLzJ5vtjvwTmkgR/s1613/IMG_5343.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1613" data-original-width="1170" height="400" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJszRq1_PJy7_CM8NZ-Y0h9CHrMM9hUh8nHjI0Naqp8gPUtbONIN8iuNm9_6qjfW82nbi4PLaA_PFhSv6q1HwSi0A7TlxMOooIR8Yy8r3PNgtXuaP6URtBQT2oZ4RgsefBr2ch6NoOkXf4zq5tSsuPBYzlCtCQVGgS4qoKh4W1ObN3WyLzJ5vtjvwTmkgR/w290-h400/IMG_5343.jpg" width="290" /></a></div><br /><p></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-8142062107152634792024-03-06T08:21:00.005-05:002024-03-06T12:47:29.528-05:00vanilla, it's the finest of the flavors<p><span style="font-family: inherit;"> <span color="rgba(0, 0, 0, 0.9)" face="-apple-system, system-ui, "system-ui", "Segoe UI", Roboto, "Helvetica Neue", "Fira Sans", Ubuntu, Oxygen, "Oxygen Sans", Cantarell, "Droid Sans", "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Lucida Grande", Helvetica, Arial, sans-serif" style="background-color: white; white-space-collapse: preserve;"><a href="https://heydonworks.com/article/what-is-utility-first-css/">What is Utility-First CSS?</a> - ripping into what is almost certainly a terrible paradigm for CSS modeling. </span></span></p><p style="--artdeco-reset-typography_getfontsize: 1.6rem; --artdeco-reset-typography_getlineheight: 1.5; background-color: white; border: var(--artdeco-reset-base-border-zero); box-sizing: inherit; color: rgba(0, 0, 0, 0.9); counter-reset: list-1 0 list-2 0 list-3 0 list-4 0 list-5 0 list-6 0 list-7 0 list-8 0 list-9 0; cursor: text; line-height: var(--artdeco-reset-typography_getLineHeight); margin: 0px; padding: 0px; vertical-align: var(--artdeco-reset-base-vertical-align-baseline); white-space-collapse: preserve;"><span style="font-family: inherit;">With so many frameworks and toolkits out there, it feels like a lot of people haven't noticed that Vanilla JS and CSS have gotten pretty good and are certainly reasonable choices for many tasks. I think the secret sauce is browsers that embrace some brilliant emergent standards - and also helped by browsers that update themselves, unlike the bad old days of wondering what version of IE you had to support. (And for better or worse the number of rendering engines has dramatically diminished, so bad quirks are even less common.)
Related: <a href="https://rosswintle.uk/2024/02/a-manifesto-for-small-static-web-apps/">A Manifesto for Small Static Web Apps</a></span></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-47270126963757036082024-03-04T09:14:00.001-05:002024-03-04T09:14:26.468-05:00more on the early prototype<p>My manager encourages us to look to Steve Jobs for inspiration, and I was surprised I hadn't posted about some iPod and iPhone prototypes that were making the rounds a while back - one is this beauty of a <a href="https://panic.com/blog/a-prototype-original-ipod/">breadboard mockup for the first iPod</a>:</p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgdSoDiL6sTmb0818nqzUsjZ1q9RrllC-9A3pmEPyhMPuF1XEyj7_wFex-xdwArwCAc6Dkxy-jIh2taNMW3XTT25skeZH0lRTJUY8RrJhEY0HkqgZe0eNrY_iggdwpSPJKBxvquOC_CWtlg_vkI62g4Iy-OMSZ0TScPdzt03q3qsG9IFjFL1NjyanR1paWX/s2560/iPod-1-scaled.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="2560" data-original-width="1708" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgdSoDiL6sTmb0818nqzUsjZ1q9RrllC-9A3pmEPyhMPuF1XEyj7_wFex-xdwArwCAc6Dkxy-jIh2taNMW3XTT25skeZH0lRTJUY8RrJhEY0HkqgZe0eNrY_iggdwpSPJKBxvquOC_CWtlg_vkI62g4Iy-OMSZ0TScPdzt03q3qsG9IFjFL1NjyanR1paWX/w268-h400/iPod-1-scaled.jpg" width="268" /></a></div><br /> (I think the screen is about the size of what they had on the first production unit, which gives you an idea of what an absolute unit this is)<br /><p></p><p>The other is <a href="https://sonnydickson.com/2017/01/11/how-apple-picked-what-came-to-be-the-iphone/">competing prototypes for what would run on iPhone</a> - when this video dropped people were surprised that a wheel-based concept was in the running. (But remember how psyched people were that iPhone ran a flavor of OSX? Now it seems inconsequential or obvious, like any gadget running a stripped down version of the linux kernel...)<br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-55275164407910314852024-02-29T09:06:00.008-05:002024-02-29T09:06:49.276-05:00ipod click and skip<p><span class="break-words
"><span><span dir="ltr"><a href="https://www.youtube.com/watch?v=H1dzNuyq6O0">Some hidden history of the iPod</a>. My boss
digs Steve Jobs' attitudes about excellence - and an aptitude for taking
resources at hand (in the case of the iPod, a new small Toshiba hard
drive) and applying them in novel ways. <br /><br />Two things I hadn't heard much about:</span></span></span></p><ol style="text-align: left;"><li><span class="break-words
"><span><span dir="ltr">the signature click wheel has heavily drawn from a phone, the Bang & Olufsen BeoCom 6000</span></span></span></li><li><span class="break-words
"><span><span dir="ltr">Part of the secret sauce was a large 32Mb "skip buffer" - advertised as
"20 minute skip protection" (remember this is an age of jostled
portable CD players leading to poor experience) its true purpose was
buffering of songs, so the device could load a few songs at once rather
than have the little hard drive constantly spinning, and so tripling the
battery life to meet critical performance metrics.</span></span></span></li></ol>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-27434824745854175482024-02-28T21:16:00.004-05:002024-02-28T21:16:51.781-05:00dutch tulips anyone?<p>Really excellent <a href="https://www.tumblr.com/phantomrose96/743538523397832704/why-is-the-whole-economy-like-dutch-tulip-mania-2?source=share">overview charting the first dot com bust through to the current dreams of AI as the savior for tech</a> - going to copy/paste it here for posterity...</p><p>(I would say too, it's really bad that the only micro-/nano-transaction model we came up with for the Internet was ads, with all the privacy breaking and other nonsense that comes with it.)<br /><br /></p><p></p><blockquote><p>If anyone wants to know why every tech company in the world right now is clamoring for AI like drowned rats scrabbling to board a ship, I decided to make a post to explain what's happening.</p><p><br /></p><p>(Disclaimer to start: I'm a software engineer who's been employed full time since 2018. I am not a historian nor an overconfident Youtube essayist, so this post is my working knowledge of what I see around me and the logical bridges between pieces.)</p><p><br /></p><p>Okay anyway. The explanation starts further back than what's going on now. I'm gonna start with the year 2000. The Dot Com Bubble just spectacularly burst. The model of "we get the users first, we learn how to profit off them later" went out in a no-money-having bang (remember this, it will be relevant later). A lot of money was lost. A lot of people ended up out of a job. A lot of startup companies went under. Investors left with a sour taste in their mouth and, in general, investment in the internet stayed pretty cooled for that decade. This was, in my opinion, very good for the internet as it was an era not suffocating under the grip of mega-corporation oligarchs and was, instead, filled with Club Penguin and I Can Haz Cheezburger websites. </p><p><br /></p><p>Then around the 2010-2012 years, a few things happened. Interest rates got low, and then lower. Facebook got huge. The iPhone took off. And suddenly there was a huge new potential market of internet users and phone-havers, and the cheap money was available to start backing new tech startup companies trying to hop on this opportunity. Companies like Uber, Netflix, and Amazon either started in this time, or hit their ramp-up in these years by shifting focus to the internet and apps.</p><p><br /></p><p>Now, every start-up tech company dreaming of being the next big thing has one thing in common: they need to start off by getting themselves massively in debt. Because before you can turn a profit you need to first spend money on employees and spend money on equipment and spend money on data centers and spend money on advertising and spend money on scale and and and</p><p><br /></p><p>But also, everyone wants to be on the ship for The Next Big Thing that takes off to the moon.</p><p><br /></p><p>So there is a mutual interest between new tech companies, and venture capitalists who are willing to invest $$$ into said new tech companies. Because if the venture capitalists can identify a prize pig and get in early, that money could come back to them 100-fold or 1,000-fold. In fact it hardly matters if they invest in 10 or 20 total bust projects along the way to find that unicorn. </p><p><br /></p><p>But also, becoming profitable takes time. 
And that might mean being in debt for a long long time before that rocket ship takes off to make everyone onboard a gazzilionaire. </p><p><br /></p><p>But luckily, for tech startup bros and venture capitalists, being in debt in the 2010's was cheap, and it only got cheaper between 2010 and 2020. If people could secure loans for ~3% or 4% annual interest, well then a $100,000 loan only really costs $3,000 of interest a year to keep afloat. And if inflation is higher than that or at least similar, you're still beating the system. </p><p><br /></p><p>So from 2010 through early 2022, times were good for tech companies. Startups could take off with massive growth, showing massive potential for something, and venture capitalists would throw infinite money at them in the hopes of pegging just one winner who will take off. And supporting the struggling investments or the long-haulers remained pretty cheap to keep funding. </p><p><br /></p><p>You hear constantly about "Such and such app has 10-bazillion users gained over the last 10 years and has never once been profitable", yet the thing keeps chugging along because the investors backing it aren't stressed about the immediate future, and are still banking on that "eventually" when it learns how to really monetize its users and turn that profit.</p><p><br /></p><p>The pandemic in 2020 took a magnifying-glass-in-the-sun effect to this, as EVERYTHING was forcibly turned online which pumped a ton of money and workers into tech investment. Simultaneously, money got really REALLY cheap, bottoming out with historic lows for interest rates. </p><p><br /></p><p>Then the tide changed with the massive inflation that struck late 2021. Because this all-gas no-brakes state of things was also contributing to off-the-rails inflation (along with your standard-fare greedflation and price gouging, given the extremely convenient excuses of pandemic hardships and supply chain issues). The federal reserve whipped out interest rate hikes to try to curb this huge inflation, which is like a fire extinguisher dousing and suffocating your really-cool, actively-on-fire party where everyone else is burning but you're in the pool. And then they did this more, and then more. And the financial climate followed suit. And suddenly money was not cheap anymore, and new loans became expensive, because loans that used to compound at 2% a year are now compounding at 7 or 8% which, in the language of compounding, is a HUGE difference. A $100,000 loan at a 2% interest rate, if not repaid a single cent in 10 years, accrues to $121,899. A $100,000 loan at an 8% interest rate, if not repaid a single cent in 10 years, more than doubles to $215,892.</p><p><br /></p><p>Now it is scary and risky to throw money at "could eventually be profitable" tech companies. Now investors are watching companies burn through their current funding and, when the companies come back asking for more, investors are tightening their coin purses instead. The bill is coming due. The free money is drying up and companies are under compounding pressure to produce a profit for their waiting investors who are now done waiting. </p><p><br /></p><p>You get enshittification. You get quality going down and price going up. You get "now that you're a captive audience here, we're forcing ads or we're forcing subscriptions on you." Don't get me wrong, the plan was ALWAYS to monetize the users. It's just that it's come earlier than expected, with way more feet-to-the-fire than these companies were expecting. 
ESPECIALLY with Wall Street as the other factor in funding (public) companies, where Wall Street exhibits roughly the same temperament as a baby screaming crying upset that it's soiled its own diaper (maybe that's too mean a comparison to babies), and now companies are being put through the wringer for anything LESS than infinite growth that Wall Street demands of them. </p><p><br /></p><p>Internal to the tech industry, you get MASSIVE wide-spread layoffs. You get an industry that used to be easy to land multiple job offers shriveling up and leaving recent graduates in a desperately awful situation where no company is hiring and the market is flooded with laid-off workers trying to get back on their feet. </p><p><br /></p><p>Because those coin-purse-clutching investors DO love virtue-signaling efforts from companies that say "See! We're not being frivolous with your money! We only spend on the essentials." And this is true even for MASSIVE, PROFITABLE companies, because those companies' value is based on the Rich Person Feeling Graph (their stock) rather than the literal profit money. A company making a genuine gazillion dollars a year still tears through layoffs and freezes hiring and removes the free batteries from the printer room (totally not speaking from experience, surely) because the investors LOVE when you cut costs and take away employee perks. The "beer on tap, ping pong table in the common area" era of tech is drying up. And we're still unionless. </p><p><br /></p><p>Never mind that last part.</p><p><br /></p><p>And then in early 2023, AI (more specifically, Chat-GPT which is OpenAI's Large Language Model creation) tears its way into the tech scene with a meteor's amount of momentum. Here's Microsoft's prize pig, which it invested heavily in and is galivanting around the pig-show with, to the desperate jealousy and rapture of every other tech company and investor wishing it had that pig. And for the first time since the interest rate hikes, investors have dollar signs in their eyes, both venture capital and Wall Street alike. They're willing to restart the hose of money (even with the new risk) because this feels big enough for them to take the risk.</p><p><br /></p><p>Now all these companies, who were in varying stages of sweating as their bill came due, or wringing their hands as their stock prices tanked, see a single glorious gold-plated rocket up out of here, the likes of which haven't been seen since the free money days. It's their ticket to buy time, and buy investors, and say "see THIS is what will wring money forth, finally, we promise, just let us show you."</p><p><br /></p><p>To be clear, AI is NOT profitable yet. It's a money-sink. Perhaps a money-black-hole. But everyone in the space is so wowed by it that there is a wide-spread and powerful conviction that it will become profitable and earn its keep. (Let's be real, half of that profit "potential" is the promise of automating away jobs of pesky employees who peskily cost money.) It's a tech-space industrial revolution that will automate away skilled jobs, and getting in on the ground floor is the absolute best thing you can do to get your pie slice's worth.</p><p><br /></p><p>It's the thing that will win investors back. It's the thing that will get the investment money coming in again (or, get it second-hand if the company can be the PROVIDER of something needed for AI, which other companies with venture-back will pay handsomely for). 
It's the thing companies are terrified of missing out on, lest it leave them utterly irrelevant in a future where not having AI-integration is like not having a mobile phone app for your company or not having a website. </p><p><br /></p><p>So I guess to reiterate on my earlier point: </p><p><br /></p><p>Drowned rats. Swimming to the one ship in sight. </p><p><br /></p></blockquote><p> </p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-83299960738796958572024-02-21T14:46:00.003-05:002024-02-21T14:46:14.226-05:00AI-yi-yi<p> </p><p><a href="https://www.wheresyoured.at/sam-altman-fried/">Subprime Intelligence</a> - Thoughtful piece questioning just how sturdy the AI boom really is - especially as we may be nearing the top of an S-shaped curve of capabilities.</p><p>I think it's a great point that for much of the generative stuff, the "killer app" isn't there (though there are a lot of small use cases, and LLMs are certainly a boon to helping software developers navigate the thick woods of toolkits and languages we've created.) </p><p>It will also be interesting to see how the public's "AI-Radar" increases, and how much people start to notice and possibly get annoyed at the garish flatness of it all. </p><p>But especially the economics... hearing how Microsoft's "investment" in OpenAI was more like a donation of computer time, and other investments seem more about tech giants trying to ensure they're the ones selling the raw cloud horsepower... sometimes you wonder where we are all going with this.</p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-59256701683792932132024-02-21T08:14:00.006-05:002024-02-21T08:14:45.510-05:00catching up on newsletters<p>Catching up on frontend newsletters...</p><ul style="text-align: left;"><li> I'm always impressed by <a href="https://thecodingtrain.com/">The Coding Train</a> - I love p5/processing and these videos seem like a great way to learn programming.</li><li>This <a href="https://tonsky.me/blog/checkbox/">rise and possible fall of the square checkbox</a> was a fun visual history.</li><li>The <a href="https://daverupert.com/2024/02/ui-states/?ck_subscriber_id=2241809034">state of the states</a>...</li><li>Back in the caveman days of Perl CGI, it sort of made sense that I used the same language for quick and dirty shell scripts and web programming... but now that my go-to is PHP for the latter (at least for the serverside)? Having a command line script inside <?php ?> always feels weird. Maybe I should <a href="https://bun.sh/blog/the-bun-shell">switch to Bun</a> for that?</li><li><a href="https://naildrivin5.com/blog/2024/01/24/web-components-in-earnest.html">Web Components in Earnest</a> seems pretty deep<br /></li></ul>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-83797963838038934392024-02-21T07:10:00.003-05:002024-02-21T13:11:34.065-05:00beginners who like TypeScript<p>from <a href="https://shawnsomething.substack.com/p/3-lessons-from-building-a-chrome">3 Lessons from Building a Chrome Extension with TypeScript</a> </p><p>It is striking to me that the "#1 lesson learned" is </p><p></p><blockquote><p>TypeScript feels like Python but JavaScript. 
It might be a reductive way of describing it, but it feels a lot friendlier to read for a non-techie, self-taught pleb like myself.</p><p>The Type interface (even though not as heavily utilized in this project) is amazing. Not struggling with broken code just cause I've inputted a different type that is not supposed to be there.</p><p>The streamlined approach to getting what you want, rather than the run around that JS makes you do, helps provide a greater understanding of what is happening from function to function.</p></blockquote><p></p><p>I find this such an alien take. For me TS is the opposite of "streamlined" - like I appreciate it for expressing shapes of arbitrary ad hoc JSON formats (though there are other ways of getting that) but am more likely to resent the syntax mess it imposes - even by itself JS (and especially with React) is doing more juggling with a pretty small # of characters ( { } : = ?) and so what any given character means in a snippet of code is way more dependent on context, and so harder to read at a glance. And "this only works with a fancy build environment" furthers that lack of a sense of being streamlined. While I know JS in the browser is gonna be some kind of sandbox in a sandbox, having to trust a mapping file to get me from the code I'm inspecting to the code I wrote never feels great. </p><p>I dunno. Sometimes I worry BASIC, Perl and JS broke my brain, and that my feelings that duck typing greases the wheels more than it creates problems and that the speed of a quick interpretation-vs-compiling loop was worth its weight in gold are actually signs of derangement. Maybe Dijkstra was right when he said</p><blockquote><p style="text-align: left;">"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."</p></blockquote><p style="text-align: left;">(I always assumed he was talking more about line numbers....)</p><p>Like I don't MIND strong typing of primitive types, but would prefer the syntax was prettier than</p><p></p><blockquote>let foo: string | undefined;</blockquote><p></p><p>(like from my Java days, I wish the descriptors came before the variable name)</p><p>Also the author didn't have to wrestle with some of TypeScript's inheritance ...</p><p>Or maybe brains are just different. </p><p>FOLLOWUP:<br />On a private Slack we got into talking about the pros/cons of types. <br /><br />I wrote<br /></p><p></p><blockquote><p>Sometimes it feels like "type" as the term for both primitive types and key/value conglomerations is weird.</p><p>like I know if you were thinking in terms of classes it should all be the same thing. But JS is very much not really that class-centric...</p></blockquote><p></p><p><br />Ashish wrote </p><blockquote><p>For me everything is a type. Records/structs have type. Map/Dictionary is a type. Functions are type (takes parameters of certain type and returns a result of certain type) etc. i recommend Elm or purescript to check out if you like front end stuff.</p></blockquote><p><br /></p><p>My response was:</p><p><a href="https://elm-lang.org/examples/buttons">https://elm-lang.org/examples/buttons</a> - this does not look like it would play well with my head. 
Seems like a bit too many layers of abstraction from what I think of as UI these days :smile:</p><p>It makes me think of CarGurus, where we used https://immutable-js.com/ - I think the benefits of predictability you get with immutability were way outweighed by the lack of syntactic sugar and sometimes the performance implications. (not helped by the wayyy academic documentation for it)</p><p>I mean that's the whole thing with functional programming and UI - almost by definition, almost everything interesting a UI does is a side effect - (most often showing something to the user) - the very opposite of what functional programming wants to be <br /><br />(Getting more deeply into React, it feels more and more rickety - like it just puts MVC into one big mess of render for the view and controller to mess with the model but not TOO much)</p><p>I think of this (1998!) quote:</p><p></p><blockquote><p>Menu items are the modern programmer's way -- even that of the Java programmer, who is too pure of heart to use pointers -- of putting an obscene number of unpredictable GOTO statements everywhere in his code.</p><p>jhayward@imsa.edu in 1998, via Usenet's "rec.humor.funny"</p></blockquote><p></p><p><br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-89253188215602250022024-02-14T11:35:00.006-05:002024-02-14T11:35:57.677-05:00sequence diagrams<p>This is 101 stuff, but: <a href="https://swagger.io/">Swagger</a> is pretty great at showing you the exposed endpoints and letting you play with them, but it doesn't always provide a good sense of information flow. </p><p>A sequence diagram can help, even if there are just two agents:</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXmbAVUTzMcAnuiVlQCiJpJ694KOgLbUuXKOV3gSoNuECv5u-4P7YnQcUruBVDIGWYqiZQ6utPLF4mzE6JtZAxe0vql57RpSA7SXrv_uPwMT2DXqXrpSvjTKNQYy_-sDLvrto4Pf-h03Bg_VQtv0LbjzTL15nmZgU-pUe-Uzd0iYmnWD0WlQb1PgbkohyphenhyphenB/s393/Untitled.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="272" data-original-width="393" height="276" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXmbAVUTzMcAnuiVlQCiJpJ694KOgLbUuXKOV3gSoNuECv5u-4P7YnQcUruBVDIGWYqiZQ6utPLF4mzE6JtZAxe0vql57RpSA7SXrv_uPwMT2DXqXrpSvjTKNQYy_-sDLvrto4Pf-h03Bg_VQtv0LbjzTL15nmZgU-pUe-Uzd0iYmnWD0WlQb1PgbkohyphenhyphenB/w400-h276/Untitled.png" width="400" /></a></div><br /><p><a href="https://sequencediagram.org/">sequencediagram.org</a> seems like a fantastic option! It uses a simple text-based format (almost akin to markdown) but also lets you do some of the editing right on the chart. (Reminds me of my habit of making a <a href="https://kirkdev.blogspot.com/2022/06/making-pill-tracking-sheets-in-php.html">first pass of a UI that's just a big textarea using an ad hoc data format for each line</a> - it's rather "expert mode" stuff but can postpone having to make a proper drag and drop or similar tool for reordering blocks, plus you get a convenient saving/sharing mechanism.)<br /><br />The one slightly hidden bit of that web app is where to get a URL to share (which for me is one of the main attractions for collaboration) - it's the 4th action icon down that implies "export", then it's "URL to Share"/Create.
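</p><p>For a flavor of the format, a two-agent diagram is just a few lines of text - this is my own from-memory sketch, so treat the exact keywords as an approximation rather than verified syntax:</p><pre>title Login flow
Client->Server: POST /login (credentials)
Server-->Client: 200 OK + session token
Client->Server: GET /profile
Server-->Client: profile JSON</pre><p>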
<br /></p><p><br /><br /><br /><br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-9416822835458295102024-02-13T09:38:00.004-05:002024-02-13T09:38:30.259-05:00what (not) to do infrastructure-wise<p> Interesting piece: <a href="https://cep.dev/posts/every-infrastructure-decision-i-endorse-or-regret-after-4-years-running-infrastructure-at-a-startup/">(Almost) Every infrastructure decision I endorse or regret after 4 years running infrastructure at a startup</a></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-75147899193143505552024-02-12T17:38:00.000-05:002024-02-12T17:38:12.278-05:00sshfs to mount remote file systems on macs!<p> Pretty cool way of mounting a remote Unix filesystem: <a href="https://www.petergirnus.com/blog/how-to-use-sshfs-on-macos">How To Use SSHFS on macOS</a></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-35054750702522134722024-02-09T11:08:00.005-05:002024-02-09T11:08:50.770-05:00we are all strange loops<p>One of my favorite books is Douglas Hofstadter's "I Am a Strange Loop", which picks up some threads from his more famous "Gödel, Escher, Bach" but also reflects him coping with the death of his soulmate wife. Hofstadter is trying to see if his ability to have conversations with Carol (based on his earlier history of a proven, high-fidelity ability to predict just what she would say) could in a way BE her living on in his head and heart - in a philosophical (and not merely poetic) way, or if that was just a consoling bit of wishful thinking.<br /><br />In talking about mind and consciousness, he constructs a playful physical metaphor of the "careenium" - a bouncing-magnetized-pinballs thought experiment of how we come to model the world in our own craniums - a model rich enough to include ourselves as a model doing the modeling, and so on and so on.</p><p>Googling to try to remember the term "Careenium", I found this page <a href="https://mybrainsthoughts.com/?p=234">explaining the concept and comparing it to GPT</a>. One challenge you run into if you collaborate with GPT is that it's not doing a great job of modeling the problem at hand in its virtual head: its model of the world is fairly static, and a conversation with it (as impressive as it is! Especially if you've played with the previous fruits of AI over the past decades) is just a probabilistic word journey through that static space. </p><p>In some ways it's right there in the name: GPT means "Generative Pre-trained Transformer", and the problem is the "Pre" - and earlier "Transformers" were notably worse at keeping track of what was just said in the dialog. <br /><br />I guess the implication is if GPT had greatly increased abilities to update its own model on the fly, if that process was more organically bootstrapped and ongoing, it might be a better candidate for "true" Artificial General Intelligence and even consciousness... 
<br /></p><p><br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-9146427328120703502024-02-08T15:44:00.000-05:002024-02-08T15:44:12.222-05:00null: the billion dollar mistake<p> I tried to look smart today when one of my teammates mentioned realizing he was finding new problems with a data stream because previously some other process was treating null as zero... <br /><br />It had me hunt down this quote from <a href="https://en.wikipedia.org/wiki/Tony_Hoare">Tony Hoare</a> on the invention of null:</p><blockquote><p>I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.</p></blockquote><p>Heh. <br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-17415174642750808672024-02-08T12:37:00.004-05:002024-03-06T13:21:13.172-05:00Mac Phantom Messages Badge - how to remove<p><i>NEWER UPDATE: </i></p><p><i>The trick is to activate Siri on the Mac if she isn't already, and then say "Siri read me my unread messages". It dug out 6 messages from years ago that were for whatever reason not showing up in the Messages app.</i></p><p>I found a few partial solutions to my Mac (Ventura 13.4) always showing 6 Unread Messages (none of my other devices showed any) on the Dock but either they weren't doable as listed or didn't work.<br /><br />I believe the steps were:<br /></p><ol style="text-align: left;"><li>Quit Messages (after making sure that all messages do indeed seem read - you should be able to go to View | Unread Messages)<br /></li><li>Go to System Settings | Notifications</li><li>Scroll down to Messages and click to open the panel</li><li>Turn off "Badge application icon"</li><li>In a terminal window, run "killall Dock" (I suspect you might be able to use Force Quit Finder? but I haven't tested that way) The Dock should go away then bounce back.</li><li>Start up Messages</li><li>Turn "Badge application icon" back on</li></ol><p>Good luck! <br /><br /><i>UPDATE: I lied. At the next message I got, the number popped back up to 7. 
The ghost messages must be living in iCloud land...</i><br /></p><p><i><br /></i></p><p><br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com5tag:blogger.com,1999:blog-8729407087018325447.post-84173459642452484712024-02-02T17:55:00.008-05:002024-02-02T17:55:46.438-05:00the pop, the rise, the fall of tech jobs<p> </p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjxcx6drkHyskVyekaT8qPOQqpDJ3K0Ds3lc7RxblIh-5P733sSxhO0fyb0hvaSJ00IGruSNcBEx_Az3Oo5YdREM8xPsiM_7r31HYev-y6_a9eKjp69I24viBzAHtkD6yfd-sKOn_VERg2CQG6a3daHBAq7xWS95GVU0lBgEbNneFZSjQSlxmVlGMYnFYio/s1488/Image%20from%20iOS.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1087" data-original-width="1488" height="293" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjxcx6drkHyskVyekaT8qPOQqpDJ3K0Ds3lc7RxblIh-5P733sSxhO0fyb0hvaSJ00IGruSNcBEx_Az3Oo5YdREM8xPsiM_7r31HYev-y6_a9eKjp69I24viBzAHtkD6yfd-sKOn_VERg2CQG6a3daHBAq7xWS95GVU0lBgEbNneFZSjQSlxmVlGMYnFYio/w400-h293/Image%20from%20iOS.jpg" width="400" /></a></div><br /><p></p><p>via <a href="https://www.statista.com/statistics/199979/number-of-employees-in-the-us-information-sector/">Statista</a> - I've really been wondering about this chart, especially hearing how the USA added a surprising chunk of jobs in January. </p><p>Tech jobs took a massive hit at the start of quarantine but were back where they were in about a year - and kept on going.</p><p>This chart makes it look like we're still about where we might have been otherwise, but this is as of October 2023... and I haven't seen too many signs that the roller coaster back down leveled out or resumed its ascension. </p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-34192740164501959242024-01-31T20:39:00.006-05:002024-01-31T20:43:06.304-05:00sRGB with the DCM<p> I was doing some good old-fashioned "Digital Color Meter"ing (a Mac app) to pluck colors out of a screen mockup - I'd get the hex values, put them into an HTML document, and then they were visibly darker... and DCM confirmed, it was a different shade reading out than I was putting in.<br /><br />Turns out, you really gotta get the right setting for the output display:<br /><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWX3cVJjr2ORYIO3A50syLNeJ-rdOpVwpy6VDVyiOXiso96lIZa1SZrMQftWk8DQ7kOV0DKfBaljdGGd18OUALs4RSeqaH8sB-VVXvn-smQF6iC03-DkgvQ0gYcBQbo4UaY-S00JqeL7ZcvNHtYNoBqBsXg71IzG0DhyoaZon1woIgZC5DTsgJYQ4YB6Ys/s786/Screenshot%202024-01-31%20at%208.38.13%20PM.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="474" data-original-width="786" height="193" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWX3cVJjr2ORYIO3A50syLNeJ-rdOpVwpy6VDVyiOXiso96lIZa1SZrMQftWk8DQ7kOV0DKfBaljdGGd18OUALs4RSeqaH8sB-VVXvn-smQF6iC03-DkgvQ0gYcBQbo4UaY-S00JqeL7ZcvNHtYNoBqBsXg71IzG0DhyoaZon1woIgZC5DTsgJYQ4YB6Ys/s320/Screenshot%202024-01-31%20at%208.38.13%20PM.png" width="320" /></a></div>"sRGB" seemed to get the most accurate color readings... 
"In the world of web browsers, the sRGB color space is the standard."Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-18062166635373620382024-01-31T11:26:00.002-05:002024-01-31T11:36:08.758-05:00the UX and Community-eXperience of remote vs in-office<p><a href="https://gizmodo.com/more-proof-that-return-to-office-is-pointless-1851209231">There’s More Proof That Return to Office Is Pointless</a> - <span color="var(--color-text)" style="font-size: inherit; font-weight: var(--artdeco-reset-typography-font-weight-normal);">I have mixed feelings on the "Return to Office" debate. Starting my current second Remote job in a row, I appreciate the flexibility of my daily schedule and lack of commute - but I also sometimes resent how my company partially "owns" my office space and makes it a little less fun for my personal side projects.</span></p><p style="--artdeco-reset-typography_getfontsize: 1.6rem; --artdeco-reset-typography_getlineheight: 1.5; border: var(--artdeco-reset-base-border-zero); box-sizing: inherit; color: var(--color-text); counter-reset: list-1 0 list-2 0 list-3 0 list-4 0 list-5 0 list-6 0 list-7 0 list-8 0 list-9 0; cursor: text; font-size: inherit; font-weight: var(--artdeco-reset-typography-font-weight-normal); line-height: var(--artdeco-reset-typography_getLineHeight); margin: 0px; padding: 0px; vertical-align: var(--artdeco-reset-base-vertical-align-baseline);">I'm also aware of how many companies and teams haven't leaned into building a sense of online community. There's a UX story here: achieving a virtual group presence and ongoing conversation is a harder row to hoe with MS Teams than some other platforms. Each Teams space feels like a collection of public email threads (with a clunky Word-like formatting tool ribbon front and center) vs the more chat-y experience of Slack, where there is a single conversation with threads branching off. </p><p style="--artdeco-reset-typography_getfontsize: 1.6rem; --artdeco-reset-typography_getlineheight: 1.5; border: var(--artdeco-reset-base-border-zero); box-sizing: inherit; color: var(--color-text); counter-reset: list-1 0 list-2 0 list-3 0 list-4 0 list-5 0 list-6 0 list-7 0 list-8 0 list-9 0; cursor: text; font-size: inherit; font-weight: var(--artdeco-reset-typography-font-weight-normal); line-height: var(--artdeco-reset-typography_getLineHeight); margin: 0px; padding: 0px; vertical-align: var(--artdeco-reset-base-vertical-align-baseline);"><br style="box-sizing: inherit;" /></p><p style="--artdeco-reset-typography_getfontsize: 1.6rem; --artdeco-reset-typography_getlineheight: 1.5; border: var(--artdeco-reset-base-border-zero); box-sizing: inherit; color: var(--color-text); counter-reset: list-1 0 list-2 0 list-3 0 list-4 0 list-5 0 list-6 0 list-7 0 list-8 0 list-9 0; cursor: text; font-size: inherit; font-weight: var(--artdeco-reset-typography-font-weight-normal); line-height: var(--artdeco-reset-typography_getLineHeight); margin: 0px; padding: 0px; vertical-align: var(--artdeco-reset-base-vertical-align-baseline);">(It seems Google Chat recently pivoted to be more like Slack - so old threads with new replies are no longer moved up. 
But Slack has a gloriously clever bit of UX with a checkbox to copy any given thread reply to the main channel - this empowers a user to selectively bring up an old conversation that might have dropped off other folks' radar.)</p><p>Of course, the tech matters a bit less than the mood of the folks using it - it's a cultural thing, and I'm not sure yet if that best comes from the top down or can emerge organically. It's a little disheartening when you see a bunch of unused channels, where the last post was months and months ago. (I think one UX takeaway there is don't get TOO fine-grained with your room/channel creation. In theory specialty topic channels might encourage discussion in that area, but in practice it can dilute an already small pool of participants.)</p><p>Getting back to other Hybrid and In-Office concepts, I do worry that as a remote worker I am more of a fungible commodity than I would be otherwise. No matter how one feels about a bias against remote work, living near Boston, I wonder if there's a competitive advantage to having hybrid or in-office as an option? 
One that, personally, I would have been OK with - though I like companies that respect employees' countering opinions on the matter.</p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-7720527330690743562024-01-30T09:45:00.002-05:002024-01-30T09:45:18.052-05:00new CSS goodness<p><a href="https://moderncss.dev/12-modern-css-one-line-upgrades/">12 Modern CSS One-Line Upgrades</a>. CSS is getting good, and automatic browser updates mean you can leverage this stuff.<br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-12055837803680808672024-01-25T17:05:00.008-05:002024-01-29T11:27:38.094-05:00swipe before/after slider in react and elsewhere<p>For work we wanted to look into making a side-by-side/before-after slider/swipe tool thing. There are a number of those available for React, but many of them seem hard-coded to do images only, not arbitrary DOM stuff.</p><p>As a warm-up I hand-rolled a <a href="https://editor.p5js.org/kirkjerk/sketches/CjP-QTVYz">version in P5.js</a> - it's still just using images but it gave me an idea of what I was up against. (It's sneakily tricky to make a cutout like this - since they virtually occupy the same space, you can't just have one in front of the other.)</p><p>I found two ways to do it in P5, where "mid" is the X-coordinate changeover point:<br /></p><p>You can draw the left image, and then start drawing the right image at mid, making sure to draw from the offscreen image at that point only: </p><pre>image(imgBefore, 0, 0,width,height);
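// (the long form of image() is image(img, dx, dy, dw, dh, sx, sy, sw, sh) -
//  destination rectangle first, then the source rectangle to sample from)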
image(imgAfter, mid, 0,width,height,mid,0,width,height)</pre><p>or you can draw the right image, and then just part of the left image:</p><pre>image(imgAfter, 0, 0,width,height);
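// (copy() takes its arguments in the opposite order -
//  copy(srcImage, sx, sy, sw, sh, dx, dy, dw, dh) - source rectangle first)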
copy(imgBefore, 0, 0, mid, height, 0, 0, mid, height);</pre><p>But of course we wanted to do it in React. I fired up ChatGPT to give a hand and came up with the content of <a href="https://codepen.io/kirkjerk/pen/JjzymzO">this CodePen</a>. (I also did a <a href="https://codepen.io/kirkjerk/pen/VwRzEwb">version with the images</a> just so it looked better)</p><p>The JS was mostly about the slider control and the jokey content, but it had that critical clipPath which was the special sauce<br /><br /></p><pre>const { useState, useRef } = React;
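
// The core trick: two full-size layers overlap in the same container, and each
// mouse move updates CSS clip-path insets so the visible slices meet at the handle.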
const BeforeAfterSlider = () => {
  const [sliderPosition, setSliderPosition] = useState(50);
  const sliderContainerRef = useRef(null);

  const handleMouseDown = () => {
    document.addEventListener('mousemove', handleMouseMove);
    document.addEventListener('mouseup', handleMouseUp);
  };

  const handleMouseMove = (e) => {
    const rect = sliderContainerRef.current.getBoundingClientRect();
    let newPosition = (e.clientX - rect.left) / rect.width * 100;
    // Clamp between 0 and 100
    newPosition = Math.max(0, Math.min(newPosition, 100));
    setSliderPosition(newPosition);
  };

  const handleMouseUp = () => {
    document.removeEventListener('mousemove', handleMouseMove);
    document.removeEventListener('mouseup', handleMouseUp);
  };

  const SampleContent = ({className}) => {
    return <div className={className}>Lorem ipsum dolor sit amet,
      consectetur adipiscing elit, sed do eiusmod tempor incididunt
      ut labore et dolore magna aliqua. Ut enim ad minim veniam,
      quis nostrud exercitation ullamco laboris nisi ut aliquip ex
      ea commodo consequat. Duis aute irure dolor in reprehenderit
      in voluptate velit esse cillum dolore eu fugiat nulla pariatur.
      Excepteur sint occaecat cupidatat non proident, sunt in culpa qui
      officia deserunt mollit anim id est laborum.</div>;
  }

  return (
    <div className="slider-container" ref={sliderContainerRef}>
      <div
        className="before-content"
        style={{ clipPath: `inset(0 ${100 - sliderPosition}% 0 0)` }}
      >
        <SampleContent className='beforeGuts'/>
      </div>
      <div
        className="after-content"
        style={{ clipPath: `inset(0 0 0 ${sliderPosition}%)` }}
      >
        <SampleContent className='afterGuts'/>
      </div>
      <div
        className="slider-handle"
        style={{ left: `${sliderPosition}%` }}
        onMouseDown={handleMouseDown}
      />
    </div>
  );
};
ReactDOM.render(<BeforeAfterSlider />, document.getElementById('root'));</pre><p>oh and there was some CSS</p>
<pre>.slider-container {
position: relative;
width: 100%;
max-width: 600px;
height: 300px;
overflow: hidden;
}
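
/* both panes are absolutely positioned over the same spot; the inline
   clip-path values set by the JS decide how much of each pane shows */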
.before-content,
.after-content {
  position: absolute;
  top: 0;
  bottom: 0;
  left: 0;
  right: 0;
  background-size: cover;
  background-position: center;
  clip-path: inset(0 50% 0 0); /* Initially half */
}
.before-content {
}
.after-content {
  clip-path: inset(0 0 0 50%); /* Initially half */
}
.slider-handle {
  position: absolute;
  top: 0;
  bottom: 0;
  width: 5px;
  background: black;
  cursor: ew-resize;
}
.beforeGuts, .afterGuts {
  font-size: 20px;
  font-family: sans-serif;
  font-weight: bold;
  width: 600px;
  background-color: white;
}
.beforeGuts {
  color: red;
}
.afterGuts {
  color: blue;
}</pre><p>Took some iterating with ChatGPT but we got there.<br /><br /><br />UPDATE: here is my final version using Styled-Components, changing the interface to use the first 2 children... (note: with styled-components I had to use the attrs() pattern instead of creating new classes each time the slider moved)<br /><br /><br /></p><pre>import React, { useState, useRef } from 'react';
import styled from 'styled-components';

// Styled components
const SliderContainer = styled.div`
  position: relative;
  width: 100%;
  height: 100%;
  overflow: hidden;
`;
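
// attrs() applies the fast-changing clipPath as an inline style, so
// styled-components doesn't generate a fresh class on every mouse move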
const Content = styled.div.attrs((props) => ({
  style: {
    clipPath: props.clipPath,
  },
}))`
  position: absolute;
  top: 0;
  bottom: 0;
  left: 0;
  right: 0;
  background-size: cover;
  background-position: center;
`;

const SliderHandle = styled.div.attrs((props) => ({
  style: {
    left: props.left,
  },
}))`
  position: absolute;
  top: 0;
  bottom: 0;
  width: 5px;
  background: red;
  cursor: ew-resize;
  display: flex;
  align-items: center;
`;

const SliderHandleCircle = styled.div`
  width: 40px;
  height: 40px;
  background-color: transparent;
  border: 6px solid red;
  border-radius: 50%;
  position: absolute;
  left: -18px;
`;

// we are assuming exactly two children
const SwipeSlider = ({ children }) => {
  const [sliderPosition, setSliderPosition] = useState(50);
  const sliderContainerRef = useRef(null);
  const [leftContent, rightContent] = children;

  const handleMouseDown = () => {
    document.addEventListener('mousemove', handleMouseMove);
    document.addEventListener('mouseup', handleMouseUp);
  };
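  // translate the pointer's X coordinate into a 0-100 percentage of the container width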
  const handleMouseMove = (e) => {
    const rect = sliderContainerRef.current.getBoundingClientRect();
    let newPosition = ((e.clientX - rect.left) / rect.width) * 100;
    newPosition = Math.max(0, Math.min(newPosition, 100)); // Clamp between 0 and 100
    setSliderPosition(newPosition);
  };

  const handleMouseUp = () => {
    document.removeEventListener('mousemove', handleMouseMove);
    document.removeEventListener('mouseup', handleMouseUp);
  };
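  // inset(top right bottom left): the left pane is clipped from its right edge and
  // the right pane from its left edge, so the two visible slices meet at the handle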
  return (
    <SliderContainer ref={sliderContainerRef}>
      <Content className="left-content-wrapper" clipPath={`inset(0 ${100 - sliderPosition}% 0 0)`}>
        {leftContent}
      </Content>
      <Content className="right-content-wrapper" clipPath={`inset(0 0 0 ${sliderPosition}%)`}>
        {rightContent}
      </Content>
      <SliderHandle left={`${sliderPosition}%`} onMouseDown={handleMouseDown}>
        <SliderHandleCircle />
      </SliderHandle>
    </SliderContainer>
  );
};

export default SwipeSlider;
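
// Usage sketch (my own illustrative example - the child component names are hypothetical):
//   <SwipeSlider>
//     <BeforePane />
//     <AfterPane />
//   </SwipeSlider>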
</pre>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-50661911821992493812024-01-24T09:02:00.006-05:002024-01-25T15:01:51.952-05:00switch<p>At my new job I'm asked to use Windows, which I haven't used on the regular for over a decade. (Though it seems like new versions of Windows got rid of the bloatware and weird early touchscreen UI issues)<br /><br />I am nervous but I like CreativePlankton's thought on <a href="https://www.reddit.com/r/MacOS/comments/rxzpyd/for_powerusers_who_use_both_windows_and/">this reddit thread on the topic</a>:</p><p></p><blockquote>I'm an IT guy born and raised on the PC, and hated Apple. Then I got a job where the company was 60% Mac and 40% PC. I figured I better learn the Mac. After a couple of weeks I felt pretty comfortable with the Mac. After a couple of months I gave no thought to the computer, just the task that needed to be done. It's kind of like driving two different cars. The various controls are in different places, but both have basically the same things. Personally, I now prefer the Mac, but seriously don't over think it. If you're in tech for any length of time knowing multiple platforms will serve you well. </blockquote>I still have to find replacements for a ton of little helper programs.<br /><p></p><div><br /></div><div>UPDATE:</div><div>Pivoting back to Windows (after ten years!) to align with my company; it's better than I had feared, but I'm still surprised my preference for scrolling the wheel up to make the document "go up" is so unusual that none of the archaeological layers of Windows settings covers it; I either have to tweak the registry (!) or install the Logitech management software.</div><div><br /></div><div><br /></div><div>Which movement is "up" for scroll wheels, touchpads, and 3D games - and how different people have different intuitions about how it should be - is an intriguing <a href="https://kirkdev.blogspot.com/2020/09/the-ux-of-scroll-direction-natural-and.html">UX issue I wrote about in my devblog a few years ago</a></div>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-20515296854180433552024-01-22T11:31:00.003-05:002024-01-22T11:32:08.914-05:00so you think you know C<p><a href="https://wordsandbuttons.online/so_you_think_you_know_c.html">So You Think You Know C</a> - this is what those JavaScript WAT moments (usually about truth-y values) feel like to me. Yes, some bits are wonky, but honestly 90% of the errors I make for myself because of loose typing, say, would be fixed if "+" wasn't both "addition" and "string concatenation". <br /></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0tag:blogger.com,1999:blog-8729407087018325447.post-91126958188783080472024-01-20T10:01:00.002-05:002024-01-20T13:50:04.910-05:00from perl to php to life with ChatGPT<p>On a private Slack, I quoted this line from yesterday's email domain validation learning - <br /><span style="font-family: courier;"> if (!filter_var($email, FILTER_VALIDATE_EMAIL)) {</span><br />(so filter_var returns false if it's not a match) and a friend pointed out:<br /></p><p></p><blockquote>`filter_var()` is exactly the kind of API abomination PHP is known for, a+++ language design</blockquote><p></p><p>He's not wrong! 
That's pretty ugly - the name of the function is kind of negative, so you're asking people to think in a double negative to use it properly (vs a more typical isValid() type naming)<br /><br />I was trying to reconstruct my history with PHP; I taught myself Perl in college, and .cgi saw me through a number of years. PHP was a boon because the useful functions were built in - often installing libraries for Perl required root access I didn't always have, and for a long time I was always copying this boilerplate into my CGI scripts:<br /><span style="font-family: courier;"><br />sub chugCGI { local (*in) = @_ if @_; local ($i, $key, $val);<br /> if($ENV{'REQUEST_METHOD'}eq"POST") { read(STDIN,$in,$ENV{'CONTENT_LENGTH'});<br /> }else {$in = $ENV{'QUERY_STRING'};} @in = split(/&/,$in);<br /> foreach $i (0 .. $#in) { $in[$i] =~ s/\+/ /g;<br />($key, $val) = split(/=/,$in[$i],2);$key =~ s/%(..)/pack("c",hex($1))/ge;<br />$val =~ s/%(..)/pack("c",hex($1))/ge; $in{$key}.= $val;}return length($in);}</span><br /><br />which read CGI variables and put them in an assoc. array when CGI.pm or CGI_lite.pm wasn't at hand.<br />So PHP was obviously a step up. But when I first tried learning it (in the first dot com crash) around Y2K, it really was even less ready for prime time - IIRC if you walked an array you had to like reset an iterator that was part of the array somehow? and 2 dimensional arrays were a right mess, and the documentation was wrong... basically it felt slapdash in a way that Perl never did, even though later I found out Perl was actually a glue language over C Unix System Calls - which blew my mind because I thought Perl (with its first class strings and associative arrays and lack of memory management, or compiling for that matter) was like the anti-C.<br /><br />But over the years it became my default for backend stuff - the Apache monolith LAMP model just gets out of the way of the front end stuff I really care about (and I still direct folks to <a href="https://slack.engineering/taking-php-seriously/">the "Taking PHP Seriously" article</a> when feeling defensive about that) but the weird thing was - since I learned it in a Google era, I hardly memorized any of it. Its API does have a lot of weird edges (especially in terms of if you have a function with arguments of an array and a single member, which argument is likely to appear first? for <span style="font-family: courier;">in_array()</span> the needle comes before the haystack, but for <span style="font-family: courier;">array_push()</span> the array comes first... which makes some kind of sense in terms of allowing you to add multiple things at once but still doesn't scream consistency.) But still, relative to Perl, there were so many useful functions built in... and also it didn't have all that weird syntax that makes "Perl golf" a hobby of some.<br /><br />And yet, I had most everything in Perl memorized in a way I never have with PHP... but now rather than googling to find my PHP bit on Stack Overflow, I have ChatGPT. As Dave Winer <a href="http://scripting.com/2024/01/17.html#a142417">put it</a>:<br /></p><blockquote>ChatGPT is like having a programming partner you can try ideas out on, or ask for alternative approaches, and they're always there, and not too busy to help out. They know everything you don't know and need to know, and rarely hallucinate (you have to check the work, same as with a human btw). It's remarkable how much it is like having an ideal human programming partner. 
It's the kind of helper I aspire to be.<br /></blockquote>Or as Salvatore Sanfilippo <a href="http://antirez.com/news/140?ck_subscriber_id=2241809034">put it</a>:<br /><blockquote>In the field of programming, perhaps [LLMs'] ability would have been of very little interest up to twenty or thirty years ago. Back then you had to know a couple of programming languages, the classic algorithms, and those ten fundamental libraries. The rest you had to add yourself, your own intelligence, expertise, design skills. If you had these ingredients you were an expert programmer, able to do more or less everything. Over time, we have witnessed an explosion of frameworks, programming languages, libraries of all kinds. An explosion of complexity often completely unnecessary and unjustified, but the truth is that things are what they are. And in such a context, an idiot who knows everything is a precious ally.</blockquote>I'm starting a new job, exploring territory that's new to me, and I am grateful to have ChatGPT by my side.<br /><p></p>Kirk Ishttp://www.blogger.com/profile/15605658292036699663noreply@blogger.com0