Wednesday, August 26, 2015

same, same, same

Dave Ellis points out the weird sameness striking corporate sites all across the land. There really is a horrible repetitiveness to it.

Some of it comes from a de facto standardization on Bootstrap - or at least on libraries that look an awful lot like it. It goes hand-in-hand with the need for responsive design: the header, splash image, and columns can readily collapse into a single, thinner column, so little thought has to be put into it.

A long while ago I noticed how certain sites had a certain "professional" look, and was trying to put my finger on exactly what it was. I came up with:
  • Nuances that move beyond just colored blocks: gradients, pinstripes, rounded edges. A lot of these are actually a pain to pull off, and it probably does take some legitimate skill to do well.
  • Use of stock photography... people just sittin' around, looking like their lives are somehow better with your product.
  • Certain layouts becoming de facto standards... all 3 of these sites use layouts based on a TV or billboard, with everything appearing above the fold. And no one seems to use "flow layout"; in general, fixed columns are considered a better idea.
Plus ça change, plus c'est la même chose. Except it doesn't take that much skill to pull off now.

Tuesday, August 25, 2015

my procrastination jujitsu and the glory days of BASIC

If you're a fretting procrastinator - someone who puts off doing things because they're anxious about success with the next bit of whatever they should be working on - one useful skill to develop is re-channeling the energy pushing you away from one task into forward momentum on another. I call this "procrastination jujitsu".

This weekend I was kind of feeling angst-y about choosing a box2d js wrapper and learning it well enough to recreate some games I'd developed using a simpler physics system. I took that fret-energy and started a project I'd been meaning to get to for a long time: Gazette Galore, mini-reviews of every game published on disk by COMPUTE!'s Gazette, a Commodore home-computer magazine that ran from the mid-80s to the mid-90s.

The blog came together pretty well. I've gotten over my "Not Invented Here" syndrome and have been happy enough using Blogspot for this blog that it made sense to apply it for this. A nice feature, too: my Gazette blog has an ending, unlike this devblog, which might just go on and on and on and on.

Anyway, this particular magazine was important to my young geek self. I was lucky enough to get a pile of magazines and accompanying disks when I inherited my Uncle Bill's C=64. As I read through the old magazines I'm surprised that some of the game articles don't just describe how to play the game - they get into how it was made, along with some programming tips-and-tricks takeaways. The whole magazine is much more programmer-centric than I remembered; computers of this era really encouraged hobbyists to get in there and make something.

Booting into BASIC, even if 98 times out of 100 a kid just used it to boot into a game, was such a nice, visibly open door to coding... many kids of that era would have been taught a little programming at school, even if it was of the
10 A$ = "KIRK"
30 PRINT A$;
40 PRINT " IS GREAT!!! ";
50 GOTO 30
variety. (That's the expanded version of the one that just prints KIRK IS GREAT over and over.)

I think about how to introduce my nephews and nieces to coding, since it's both a pleasant, creative activity and it potentially leads to one of the few careers I can heartily recommend. There are some cool things like MIT's Scratch, and I'll be reading Seymour Papert's "Mindstorms" (focusing on the glory of LOGO) as well as trying to spread the good word of Processing, but it's just not the same. 8-bit built-in BASIC felt like it connected to everything the machine knew how to do, more or less; everything else seems to live in these little walled gardens...

Friday, August 21, 2015

xpath fun

At work we use Robot and Selenium to do automated testing. Most of the page selectors use xpath; at first I didn't like that (vs. using CSS selectors), since it seemed like one more thing to learn without adding value - most of us already know CSS, and this was the only place we were using xpath - but the fact that xpath can do contains(text(),'Some Text') makes it worth its conceptual weight vs. CSS.

In one of our tests, a menu clicker like
//span[@class='x-menu-item-text'][contains(text(),'Flights')]
was failing because the matching item wasn't visible - even though, to the casual observer, it certainly seemed visible, with the parent menu open on the page.

The previous solution was to have the Robot test close the browser and re-navigate to the page... obvious overkill. It worked, but we weren't sure why.

It turned out that the ext.js code was creating a new instance of the menu without destroying the old one as Robot navigated back and forth through screens in this one-page app - it was just hiding the old menu (display:none).

We probably should hunt that code down in the source and fix it, but in the meanwhile
(//span[@class='x-menu-item-text'][contains(text(),'Flights')])[last()]
finds the last - and therefore relevant - instance to click on. The last() required some extra parens, to make sure we were taking the last of the whole matching set rather than asking for an element that was last among its siblings in the DOM, but it still seems easier than rigging up an xpath for "the visible one" - that gets complicated.

Another thing I learned is that Firefox's default developer-tools console gives you a $x() function "for free" that evaluates xpath expressions... so when
$x( "(//span[@class='x-menu-item-text'][contains(text(),'Flights')])[last()]" ).length
in the console returned 1 instead of 2 or 0, we knew we had finally constructed the xpath properly, and we were able to iterate on variants of the syntax much more quickly than if we were waiting for robot to churn through the test.

Of course, one additional takeaway (especially relevant for the unwary UI developer turned QA engineer) is the empirical observation that Robot handles multiple xpath matches very differently from how jQuery handles multiple CSS matches: much of jQuery's power is effortlessly applying operations to all matching nodes, while Robot evidently just takes the first match in the set.
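The contrast can be sketched in a few lines of plain JavaScript (toy functions, not the real jQuery or Robot APIs - just the two iteration styles):

```javascript
// Two styles of handling a selector that matches multiple nodes.
// Stand-in data: pretend both strings matched our menu-item xpath.
var matches = ["staleHiddenMenuItem", "freshVisibleMenuItem"];

// jQuery-style: implicitly apply the operation to every match.
function clickAll(nodes, clickedLog) {
    nodes.forEach(function (node) { clickedLog.push(node); });
}

// Robot-style: silently act on only the first match.
function clickFirst(nodes, clickedLog) {
    if (nodes.length > 0) clickedLog.push(nodes[0]);
}

var jqueryClicks = [], robotClicks = [];
clickAll(matches, jqueryClicks);   // both items get "clicked"
clickFirst(matches, robotClicks);  // only the stale, hidden one gets "clicked"
```

That first-match behavior is exactly why the stale, display:none copy of the menu was the one getting clicked.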

Our Robot tests were also made excessively brittle by long, exact-match class selectors:
//div[@class="classa classb classc classd"]/em/button
A test was failing because the order of the classes applied to the element had changed.
The best workaround seems to be picking one (or two) of the most relevant classes and using contains() for a partial match:
//div[contains(@class,"classa") and contains(@class,"classb")]/em/button
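The idea behind that contains(@class, ...) pattern - match on the classes you care about, ignore their order - can be sketched as a plain function over the class attribute string. (Note this token-based sketch is actually stricter than xpath's contains(), which does substring matching and would also match, say, "classa2".)

```javascript
// Order-insensitive class matching: does the class attribute contain
// every class we require, regardless of ordering?
function hasClasses(classAttr, required) {
    var present = classAttr.split(/\s+/);
    return required.every(function (cls) {
        return present.indexOf(cls) !== -1;
    });
}

// Exact string comparison breaks as soon as the class order changes...
var reordered = "classb classa classc classd";
reordered === "classa classb classc classd";    // false

// ...while checking only for the classes we care about still works:
hasClasses(reordered, ["classa", "classb"]);    // true
hasClasses("classa2 classd", ["classa"]);       // false (whole-token match)
```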

Wednesday, August 19, 2015

sound effects in processing to processing.js conversions.

Over the past decade I wrote a lot of games and toys in Processing. When I ran into the original IDE/applet-generating kit in 2004, it was empowering. Visual Basic is the only other language in my developer history that comes to mind as a powerful, flexible, easy-to-learn toolkit for making fun interactive things I could share with others.

Of course, Java applets (Java programs run in the browser) have gone the way of the dodo, and so for an upcoming project I've been looking into converting almost all of my old games into something that runs in modern browsers. Fortunately, there is Processing.js, a freak of nature that lets modern browsers run legacy Java sketches. I wrote about some of the porting gotchas 3 years ago. The biggest hurdles remaining are sound and 2D physics.

I usually used the "minim" library for sound in Processing. I figured the easiest replacement would be my own lowLag library. It would also be cool if I could keep the "dual mode" nature of Processing, where I can toggle between Java and the browser-friendly "JavaScript" mode without recoding.

Net-net, I wrote this little bit of javascript that exposes lowLag functionality to the Processing code under the same interface as the old minim objects:
window.AudioPlayer = function(f){
    this.file = f;
    this.rewind = function(){};
    this.play = function(){
        lowLag.play(this.file);
    };
};
window.Minim = function(){
    this.loadFile = function(file){
        lowLag.load(file);
        return new AudioPlayer(file);
    };
    this.loadSnippet = this.loadFile;
};

(yeah, I'm ignoring "rewind", and treating Audio Files the same as Snippets.)

The other trick was getting the Processing IDE to respect these files, so I could edit and then run in the browser without always manually hacking the generated index.html. Asking about that on the Processing forums, the best bet seems to be twofold:
1. Update ~/Documents/Processing/modes/JavaScriptMode/template/template.html so that the generated index.html has a reference to the .js file
2. If you don't want to refer to the .js files on a known host, put them in the same working directory and they'll be automagically copied into the web-export folder the IDE kicks out when you "run" a Processing sketch in JavaScript mode. However, this borks toggling between Java and JavaScript modes (when you switch to JS mode it reads the pure .js files as editable tabs, and then when you try to go back to Java mode it refuses, opening a blank window instead).
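For what it's worth, step 1 amounts to adding script tags to that template so the generated index.html pulls in the extra libraries. A hypothetical sketch - the file names and paths here are assumptions, so point them at wherever lowLag.js and the minim shim actually live, alongside whatever script tags the template already contains:

```html
<!-- added to template.html, next to the processing.js script tag
     the template already includes -->
<script src="lowLag.js"></script>
<script src="minimShim.js"></script>
```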

From that forum, I also learned that transpiled Processing is being deprecated in favor of p5.js, which provides the classic Processing API as a library to be called natively by your js. Transpiling was always such an odd beast that I don't really mind moving on... and I guess I shouldn't feel too bad about the hacks I do to support my legacy Java/JS cross-compiling modes.

Next step: finding a good 2D physics solution. I had done some cool things with Jean Maxime Coulliard's PPhys2D - now defunct, but it had an awesome learning section. I need to find a good wrapper for Box2D, and may end up porting those things to p5.

Friday, August 14, 2015

the fire and motion of javascript frameworks

Joel on Software's Fire and Motion essay has been in my head for over a decade.

In it he describes two phenomena: one is the trouble developers sometimes have getting the actual drudgery of coding underway; the other is how unstoppable they can feel when they really get on a roll. (I've heard it likened to a giant rolling stone... tough to get started, but capable of enormous momentum once it's under way.)

He also describes this as a deliberate strategy companies can employ: keep churning out checkbox technologies so that rivals are always heads-down meeting the new checkboxes, never able to really fire back.

Sometimes it feels like the famously large mess of javascript frameworks and supporting tools has reached that point. I don't know if it's deliberate or a side-effect of a ton of people with bad cases of NIH (Not Invented Here) and premature abstraction, but it makes my professional life feel a lot more fraught than I think it otherwise would.

(Random other, newer article from Tal Bereznitskey: 7 lessons Soccer taught me about management)

Thursday, August 13, 2015

life without jQuery

I've finally turned my attention back to my lowLag low latency HTML5 audio project... I've been neglecting it for far too long, especially considering it's the strongest example of a successful opensource project in my portfolio.

One upgrade is getting rid of the dependency on jQuery. (Especially since I plan to be doing a lot of work in processing.js, and needing both libraries seems silly.)

You Might Not Need jQuery is a tremendously useful page for this, showing the vanilla-javascript versions of common jQuery phrases, but it favors conciseness over completeness.

The biggest difference living without jQuery's tremendous merge of CSS-style selectors with general attribute and style wrangling is living without its implicit "foreach": with jQuery you can run an operation against a selector that matches zero, one, or many objects, and it will do what you expect. In pure Javascript land, the zero case is prone to null errors if you're not careful, and any looping you have to build yourself.
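Here's a minimal sketch of what I mean - a null-safe foreach over anything array-like, so the zero/null case falls through harmlessly (the function name is just for illustration):

```javascript
// The zero-one-many loop jQuery does implicitly, written out once.
// "nodes" can be a NodeList from querySelectorAll, an array, or null
// (e.g. a getElementById miss).
function forEachMatch(nodes, fn) {
    if (!nodes) return; // null case: quietly do nothing
    for (var i = 0; i < nodes.length; i++) {
        fn(nodes[i], i);
    }
}

// With a plain array standing in for a DOM NodeList:
var seen = [];
forEachMatch(["a", "b", "c"], function (el) { seen.push(el); });
forEachMatch(null, function (el) { seen.push(el); }); // no-op, no crash
```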

For my lowLag object, I found it useful to add the following convenience functions:
this.createElement = function(elemType, attribs){
    var elem = document.createElement(elemType);
    for(var key in attribs){
        elem.setAttribute(key, attribs[key]);
    }
    return elem;
}
this.safelyRemoveElement = function(elem){
    if(elem) elem.parentNode.removeChild(elem);
}
this.safelyRemoveElementById = function(id){
    this.safelyRemoveElement(document.getElementById(id));
}

So I replaced code like
$("body").append("<div id='lowLag'></div>");
with the somewhat clunkier
var divLowLag = this.createElement("div",{"id":"lowLag"});
document.body.appendChild(divLowLag);

safelyRemoveElement points out how jQuery's $("some selector").remove(); covers up the fact that you're really modifying the parent of an element when deleting the child.

Monday, August 10, 2015

vidyo and #uifail

My company uses a video chat product called "Vidyo". On the desktop it works pretty well; we also invested in some hardware for various conference rooms, which is a bit more uneven. (I think the hardware comes from various other companies.)

One incarnation of the dedicated hardware install has a large TV with a menu display like this:

It's clunky but not terrible. It's designed for generic hardware with a phone-like number pad and an arrow-key crosspad but no alphanumeric keyboard, so you press a button, a vaguely T9-like menu pops up, and you press the direction you meant.

But, you know, not a bad use of limited hardware input. Now, though, in the other room there are nice little desktop units with touch screens! We can put this awkward, clunky UI stuff away and just type, right?

Nope! They use their lovely touch screen as... a number pad and a crosspad of directional buttons. Making things even weirder, the "OK" button is actually in the middle of the virtual crosspad; it took me a minute to figure that out.

And it's not like the product suite is a beacon of strict UI uniformity... some of the other dedicated hardware units have a very different UI:
So you can see there's a virtual onscreen keyboard... being operated by the remote's arrow keys.

So: the touch screen is used like a low-key-count keyboard, and the remote is used to laboriously navigate an onscreen keyboard. Brilliant! (Actually, I think a virtual keyboard is much easier for people to use than that weird T9-like system, and about as fast for everyone except super-hardcore power users.)

Tuesday, August 4, 2015

voice to text thought

I notice that the mistakes OSX's voice-to-text makes are pretty dramatic: entire words get changed or ignored, or sometimes entire mini-phrases emerge that weren't there at first. (It makes "Damn You Autocorrect" seem pretty innocent in comparison.)

It makes me think there should be a special kind of quote mark - maybe emoji-based? - that would demarcate voice-to-text'd writing. (I think of the way some business mail used to say "dictated but not read".)