An update on the new laptop

A couple of months ago I posted here about my new Samsung Galaxy Book 4 Edge laptop. I thought I should update you on my progress. I just posted the following to the Samsung support forum…..

I acquired a Galaxy Book 4 Edge laptop for personal use to explore the new features of Windows Copilot+ PCs. Unfortunately, I’m finding it difficult to do so because of strange issues with the GB4E. I’ve compared notes with colleagues who are using various Copilot+ laptops, and these issues occur only on the GB4E.

I enrolled the system in the Windows Insider program (Dev channel) to get early access to new Windows features like Recall. The Insider program uses the Feedback Hub application to track any problems. However, when I try to start Feedback Hub, I am told that because I’m “on a tented project” I will need to sign in with an AAD account. Since this is a personal machine, I do not have such an account. I have no problems using Feedback Hub on any of my other computers.

Next, I tried to set up the new Recall application. When I did so, it instructed me to “Turn on device encryption” in Settings. I checked System Information, which confirmed that the GB4E supports device encryption. However, when I tried to turn it on, I got a generic error message, and although device encryption appears to be activated, it advises me that “Some drives can’t be managed with device encryption.” The only drive on my system is the built-in SSD. When I restart Recall, it repeats the instruction to “Turn on device encryption.”

What’s going on? What’s a “tented project”, and why is a GB4E part of one? My friends with Copilot+ laptops from Microsoft and HP are encountering none of these issues….

Maybe it was the propaganda, not the campaign

I’ve always been careful to play by the rules on Facebook. I’ve avoided posting the kind of content that has led some of my friends to have posts taken down or spend time in “FB jail”. However, over the last couple of months (yes, during the election season), I had several of my FB posts deleted for violating “guidelines”, for “trying to get likes”. The common element in all of these was that I posted a link to a piece in a mainstream media channel, with a short introduction on the subject and why I thought it mattered. (I never post bare links.) Well, we now know that this was a deliberate policy change by Meta to “depoliticize” FB. Of course, channels like X didn’t do anything of the kind (quite the reverse), and so the effect was to shift the overall sharing of content rightwards.

I’m also posting this on FB, and to avoid that FB policy I’ll use a comment to share the link to the Emptywheel piece that includes hard numbers to back up this claim. I’ll also link to this blog piece from Bluesky.

As Marcy points out, “If I’m right about that dynamic — that politics worked but propaganda worked far better — then it means much of the post-election soul-searching is misplaced (and, indeed, a dangerous misallocation of focus). That’s because Harris lost, in part, because of media dysfunction, because electoral choice became dissociated from political persuasion more than any recent US election, largely due to an assault on the press and rational thought.”

Actions speak louder than words

It occurs to me that the political consequences of the failure of the Washington Post and LA Times to endorse Harris have been much more significant, and immediately positive, than their endorsements would have been. This is almost certainly unintentional, but let’s run with it.

If the papers had published their endorsements, it would have had almost no impact. It would have been predictable, unremarkable, and instantly forgettable. Instead, we have a dramatic and effective demonstration of the reality of the central message of the Harris campaign: that Trump is a fascist whose disrespect for the rule of law causes even oligarchs to bend the knee. That demonstration is reverberating widely. Actions speak louder than words: Bezos signaling that he is scared of Trump is much more potent than yet another politician using the “F” word.

From NYmag:

When Donald Trump first ran for president, he began to threaten that Amazon and Jeff Bezos would pay the price. “If I become president — oh, do they have problems. They’re going to have such problems,” he warned. Trump’s grievance with Amazon was centered on Bezos’s ownership of the Washington Post, a connection the president did nothing to disguise. […]

In 2019, Trump found his lever. Amazon was due to receive a $10 billion cloud-computing contract from the Pentagon. The Pentagon suddenly shifted course and denied Amazon the contract. A former speechwriter for Defense Secretary James Mattis reported that Trump had directed Mattis to “screw Amazon.”

This is the context in which the Post’s decision to spike its planned endorsement of Kamala Harris should be considered.

Some thoughts on video gaming

So I just played Skyrim for (probably) the last time. And that’s a big deal. Let me explain.

I love playing big, sprawling, open-world video games. I’m not very good at them, so I always play on the easiest settings, but I love the immersive experience of exploration, puzzles, discovery, and (yeah) combat. Like many others, the game that defined the genre for me was Skyrim, which came out in 2011. I played it for several years, on various devices (PCs, handhelds, consoles), and on every playthrough I found something new. (Sometimes it had always been there, sometimes it had been added in a new edition of the game or a mod from the Creation Club.) And from Skyrim I moved on to the various Fallout and Assassin’s Creed games, and then The Witcher 3, Horizon Zero Dawn, and Ghost of Tsushima. After each of them, I found myself coming back to Skyrim (and sometimes to Fallout 4), even though the graphics and gameplay were looking increasingly dated. Familiarity, I guess.

Last year Bethesda, the team that created Skyrim, released Starfield. It was widely panned. I bought it, played it through, yawned, and went on to other things. This year, the first add-on for Starfield appeared. It wasn’t all bad, but it was disappointingly repetitive: the main quest, culminating in a manic “fight your way out of a collapsing ship/building before you die”, had been done to death (pun intended). Enough. Oh, and the graphics weren’t a patch on Horizon Forbidden West. (Just compare the Va’ruun Citadel with a Horizon cauldron.)

Waiting for a couple of games to be released in the fall, I decided to do another Skyrim playthrough. Basic game, no mods, take my time, pay attention to the little details I often overlook. Without mods, the game really showed its age, but it was clearly a much better, more immersive, more carefully crafted world than Starfield. And even though I was stopping to smell the roses (and found some charming details I’d previously missed), I still zipped through the early part of the game, becoming arch-mage at the college, and signing up with Ulfric’s Stormcloak rebellion.

And then it all went wrong. With the Stormcloaks, I attacked Whiterun and defeated the Jarl. I returned to Ulfric, and waited for my next mission. He wouldn’t give me one. I wondered if I’d forgotten to finish something at Whiterun, so I opened my map. Whiterun had disappeared. I travelled to a nearby town, and “walked” from there to Whiterun. Even though there was nothing on the map, the town was still there, and the Jarl was still on his throne. There was no sign of the conflict, no loose end I could clean up. I travelled back to Ulfric. He still wouldn’t give me a mission, nor would he initiate the attack on Whiterun. I was stuck. And I didn’t want to go back to a save from a few days ago and go through it all again.

Even after 13 years, it’s not surprising that a game like Skyrim still has major bugs… but I don’t have to put up with them. That bug destroyed the spell. There are so many better games to play, and I will. Bethesda’s simply lost the plot, and for me, Skyrim, Starfield, and Fallout are history. (And so, I fancy, are the Assassin’s Creed games.) When I get back from Massachusetts, I’m going to replay the remastered Horizon Zero Dawn, and dive back into Baldur’s Gate 3 – until Avowed appears, anyway.

Laptop update

Five years ago I was just starting my new (and final) gig, Chief Open Source Officer for a startup, the (now defunct) Lacuna Technologies. I’d been working at Verizon, where my work laptop was a MacBook, but I knew that Lacuna would be a clean sheet. Hugh Martin (CEO) and I wanted to use Microsoft Office as the basis for the productivity environment, in part because we’d had mixed experiences with Google. I knew I was going to need a new work laptop, and that I’d be doing a fair amount of traveling, and so I picked up a Microsoft Surface Pro X, a 2-in-1 with an ARM CPU and a slot for an LTE SIM. It was risky, because Windows on ARM was still fairly experimental, but Microsoft was making all the right noises.

As it turned out, most of the people we hired into Lacuna wanted to use Google apps, so we abandoned Microsoft Office. And although the Surface Pro X was wonderfully light, I never found the 2-in-1 form factor very convenient. (It’s hard to use on your lap, unless you go for full tablet mode.) Also the reliance on x86 emulation meant that it was slow, and Microsoft did a lousy job of pumping up the Windows-on-ARM ecosystem. And finally the pandemic meant that I was working from home, and didn’t need mobile connectivity beyond what I could get from my phone.

When I retired from Lacuna, I stuck the Surface Pro X on the shelf, and replaced it with a maxed-out Dell laptop, which has served me well for the last two years. But I’ve always been an early adopter of new technology (remember Google Glass?), and so this week I decided to replace the Dell with something lighter. And I chose… an ARM-based Windows laptop! Specifically, I chose the 14-inch version of the Samsung Galaxy Book4 Edge, with a Snapdragon X Elite chip. It weighs 1.17 kg, compared with 1.07 kg for the Surface Pro X and 1.61 kg for the Dell, and it has a gorgeous 120 Hz OLED screen. (I’ve always loved Samsung’s displays.) Best of all, it was a great deal: $899 from Best Buy, $450 less than at the Samsung store. (Most of the current crop of so-called Copilot+ laptops are over $1,000, and, oddly, it’s not available from Amazon….)

I hesitated before buying, however, because I’d seen some conflicting reviews of the product concerning performance and battery life. It turns out that the culprit is Microsoft, which has unnecessarily complicated the configuration of power modes in Windows 11. This excellent video makes it…. well, not easy, but comprehensible.

So far I’m really enjoying the Book4 Edge. As usual, I’ve joined the Windows Insider program, so I’m curious to see how this generation of Windows laptops evolves. More anon.

Project frustration

I need to vent….

I’m working on what I hoped would be an easy little project. I fell in love with the replica computers created by Obsolescence Guaranteed, and decided to get a PiDP-11. Yes, I had worked with PDP-8s and a PDP-10 long before I got my hands on an 11, but there’s a lot more you can do with an 11, including running Unix V6 and V7! I knew my limitations, however (I’ve never been very good at soldering), and so I bought a pre-assembled model. All I needed to do was add a Raspberry Pi, and I’d be in business….

While I was waiting for the PiDP-11 to arrive, I decided to familiarise myself with the SIMH software that underpins this and many other computer system simulations. I grabbed the nearest hacking laptop (which happened to have Fedora installed on it), and immediately ran into the tangle of issues that plagues so much open source. First, I learned about the split between SIMH and OpenSIMH. Then I found that the latest OpenSIMH on GitHub wouldn’t build from source (and of course you have to build from source – this is FOSS 🙄) – something to do with CMake on Fedora. And everything seemed to depend on a bunch of shell scripts that looked like they’d only been tested on Ubuntu and didn’t work on Fedora. Sigh. I looked for pre-built binaries for Windows or Mac, failed, and decided to wait for the hardware.

My assembled PiDP-11 arrived, complete with a preloaded microSD card for the Raspberry Pi. I hooked up a nice new Pi 4 Model B to a screen and keyboard, booted it, and everything worked just fine. But before I installed the Pi into the PiDP-11 case, there was one more thing I needed to set up. I wasn’t planning to leave everything hooked up to an HDMI display and USB keyboard, so I needed a way to log in to the system via ssh over WiFi. So I created two new accounts on the Pi: one for remote admin, and one as a virtual terminal for the PiDP-11. I enabled WiFi and ssh on the Pi, booted it up, and successfully ssh‘d in from a Windows laptop. (I discovered that Windows Terminal now includes an ssh client – nice!) To find out the IP address of the Pi, I logged in to my Netgear Nighthawk router and looked at the list of connected devices. Excellent. All set. Except….

How could I be sure that the Pi would always get that IP address? DHCP hands out addresses on a first-come, first-served basis. Clearly I needed to assign a static address to the Pi. So I logged in to the Nighthawk, navigated to LAN Configuration, tried to assign an address, and failed with “Bad MAC address”. I consulted the Internet…. and discovered that the Nighthawk firmware has been broken for over a year, and the community has been yelling at Netgear to fix the Address Assignment bug, to no avail. 🤬
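(For anyone hitting the same bug: there’s a workaround that sidesteps DHCP reservations entirely – address the Pi by its mDNS hostname instead of by IP. Here’s a minimal sketch, assuming the Pi’s default Avahi responder is running and that the machine you’re connecting from resolves .local names, as recent versions of Windows and macOS do. The hostname and account name below are made up purely for illustration.)

```python
# Minimal sketch: reach the Pi by its mDNS name rather than a fixed IP.
# Raspberry Pi OS ships an Avahi responder, so the host should advertise
# itself as "<hostname>.local"; "pidp11" and "pdpadmin" are hypothetical.
import socket
import subprocess

PI_HOST = "pidp11.local"   # hypothetical hostname set via raspi-config
PI_USER = "pdpadmin"       # hypothetical remote-admin account

def current_address(host: str = PI_HOST) -> str:
    """Resolve whatever IPv4 address DHCP handed the Pi this time."""
    return socket.gethostbyname(host)

if __name__ == "__main__":
    addr = current_address()
    print(f"{PI_HOST} is currently at {addr}")
    # Hand off to the stock OpenSSH client that ships with Windows 10/11.
    subprocess.run(["ssh", f"{PI_USER}@{addr}"])
```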

I really didn’t want to have to buy a new WiFi AP/router. But I think I’ll have to.

UPDATE: I decided to use the opportunity to upgrade our home WiFi to support WiFi 7. Future-proofing, I hope…..

A project in lieu of coursework

My friend Tom Lyon just dug up a piece put out by the University of Essex in 1970 to describe their new PDP-10. It also mentions that they were acquiring a PDP-15 for “research and teaching”. And that reminded me of an interesting project.

I applied to Essex in the spring of 1968, intending to study economics under Richard Lipsey. Lipsey was something of an enfant terrible in the field of “math-econ”, and I was really interested in his work on mathematical modelling. So I ignored the advice of my teachers at RGS to try for Cambridge, and was accepted at Essex. Rather than going straight up, I decided to do a “gap year” working at AERE Harwell, which introduced me to what was going to be my life-long work: computing. This meant that when I arrived at Essex in September 1969, I wasn’t too disappointed when I learned that Lipsey had resigned and decamped to his native Canada.

The academic structure at Essex was interesting. During my first year, I divided my time between mathematics, economics, and computer science. At the end of that year, I was required to pick a major, which would dictate my studies over the remaining two years. Obviously, I chose computing, which led to another decision. The computer science program included an “industrial experience” segment, which meant that the second year would comprise two terms at Essex followed by five months of work in “industry” (which would be graded).

In September 1970, I returned from my summer job at the Ministry of Technology to begin my second year. As I wrote elsewhere, there were several of us students who had almost as much experience as the academic staff, and in several of the courses it was going to be difficult to accommodate the wide range of abilities and experiences. Creativity ensued.

As mentioned above, Essex had just acquired a PDP-15. This was, perhaps, a poor choice. The department had several PDP-8s, and probably saw the PDP-15 as a nice step up from the limitations of a 12-bit architecture. In practice, the PDP-15 turned out to be a dead end. It was a TTL-based evolution of the family that included the PDP-1, PDP-4, PDP-7 and PDP-9. (IIRC, the electrical engineering department had a PDP-7 or PDP-9.) However, in 1970 DEC introduced the PDP-11, which spelt the end of the 18-bit architecture.

Nevertheless, the PDP-15 was chosen as the design which would be used to teach hardware architecture and machine language programming to undergraduates. And almost immediately a problem arose. The number of students studying computer science was growing rapidly, and it was going to be difficult for them all to get hands-on time on the PDP-15. So one of the staff (probably Dave Lyons) cornered several of us more experienced students and made a proposal. Rather than taking the machine language programming and hardware architecture courses, we could work together to develop a PDP-15 emulator to run on the PDP-10, so that future students would be able to use a virtual PDP-15 for their studies. Naturally, we would need accounts on the PDP-10 to do this work, and we’d be the only undergraduates with full access.

Four of us agreed enthusiastically. We carved up the project – TTY-based console UI, hardware emulation, resource management, and so forth. Everything was written in MACRO-10. I implemented the instruction set emulation, using the PDP-10’s elegant byte manipulation mechanisms to decode PDP-15 instructions. I don’t remember all of the details, but I think we hit our goal of booting up the DECsys operating system on the emulator in February 1971.
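Purely for illustration, here’s roughly what that instruction decoding amounts to, sketched in Python rather than MACRO-10. It assumes the PDP-15’s page-mode memory-reference layout (a 4-bit opcode, an indirect bit, an index bit, and a 12-bit displacement packed into an 18-bit word); treat it as a sketch of the idea, not a reconstruction of what we wrote.

```python
# Rough sketch of decoding one PDP-15 memory-reference instruction.
# Assumed field layout (DEC bit numbering, bit 0 = MSB of the 18-bit word):
#   bits 0-3   opcode
#   bit  4     indirect (defer) bit
#   bit  5     index bit (page mode)
#   bits 6-17  12-bit displacement within the current 4K page
# On the PDP-10 this was done with byte pointers and LDB; here, plain shifts.

WORD_MASK = 0o777777          # 18-bit word

def decode(word: int) -> dict:
    """Split an 18-bit PDP-15 word into its page-mode instruction fields."""
    word &= WORD_MASK
    return {
        "opcode":   (word >> 14) & 0o17,   # bits 0-3
        "indirect": (word >> 13) & 0o1,    # bit 4
        "index":    (word >> 12) & 0o1,    # bit 5
        "disp":     word & 0o7777,         # bits 6-17
    }

if __name__ == "__main__":
    # Example: octal 200123 -> opcode 04, direct, no index, displacement 0123.
    print(decode(0o200123))
```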

In April 1971, we all headed off to our “industrial experience” gigs. (I returned to Harwell, but that’s another story.) I think a couple of grad students were given the job of turning our project into a usable service for students. I hope it served them well.

A week of public transportation…

I just got home from a whirlwind week in the UK to visit family, friends, and colleagues. I was struck by the fact that everything depended on a diverse network of public transportation, and I thought it might be amusing to document all of the services I used during that week. So here goes. (This is going to be long….)

Wednesday February 7: I flew from Portland (PDX) to San Francisco (SFO), and then on to London (LHR). To get from home to PDX I used a rideshare service (Uber); the only realistic alternative would have been to drive and leave my car at the long-term car park.

Thursday, February 8: I arrived at Heathrow around noon. Passport control was swift and self-service: “scan my UK passport, gaze into the camera”. No checked baggage to retrieve, so I headed to my hotel in Pimlico. There are several ways to get from Heathrow to London, but I chose the oldest and cheapest: Piccadilly Line tube from Heathrow to Earls Court, and District Line from Earls Court to Victoria. My hotel was a 7 minute walk from Victoria. As with all of my bus and tube travel in London, I used my phone to pay; I didn’t have to buy tickets, or even install a special app. In fact, I didn’t use a (physical) credit card or cash during the entire trip. And of course(?), for the various travels which I’d booked in advance – United Airlines to London, Eurostar to Paris, GWR to Oxford, Avanti to Manchester – all of my tickets/boarding passes lived on my phone, either contactless or QR.

Friday, February 9: I had a business meeting in Paris, and so I was travelling by Eurostar train from London St. Pancras to Paris Gare du Nord. My train left at 7, but because of the passport control bullshit following Brexit you’re advised to arrive 60-90 minutes before your train. I could have used the Tube, but the service frequency at 5am is iffy, so I asked the hotel to call a taxi, which arrived in about 90 seconds. Sweet. Once I got in line at St. Pancras, I was invited to switch to the 6am train, which I did. Eurostar was very nice; I was traveling in Standard Premier class, and it was quiet and delightful.

Within Paris, I planned to take the Metro from Gare du Nord to Ecole Militaire using lines 4 and 8 (changing at Strasbourg Saint Denis). In preparation, I tried to set up for contactless payment, but this proved much more difficult than in London. (You have to install the IDF Mobilités app, which then asks you to install the My Travigo ticket management app.) The setup process failed several times, so in desperation I restarted my phone; that did the trick.

The return journey to London was uneventful, and I took the Victoria Line from St. Pancras to Pimlico to get back to my hotel.

Saturday February 10: The main event of the day (and the ostensible reason for the whole trip) was a family gathering in Blackheath, in south-east London. The party didn’t start until 1, so in the morning I spent some time in my favourite parts of London. I caught a 24 bus (my first ride on a New Routemaster) from just outside my hotel up to Leicester Square, and meandered towards Covent Garden and the London Transport Museum. Eventually I walked down The Strand to Charing Cross station and took a Southeastern service out to Blackheath. After the (delightful!) family event, I walked back to Blackheath station and took another Southeastern train to Victoria.

Sunday February 11: I may have been born in London, and have lived more than half my life in the US, but Oxford is one of my home places. So on Sunday I took a GWR train from Paddington to Oxford, to have lunch with an old friend and do a little shopping in Blackwell’s. I used the Circle Line to get from Victoria up to Paddington, which worked very well but triggered a momentary confusion, because I grew up in the era when the Circle Line was really a “circular” service, rather than today’s weird loop! The GWR service was on one of their 800-class bi-mode multiple units (it’s electric from London to Didcot, then switches to diesel power up to Oxford and beyond). The 800s are widely criticized; maybe my standards are too low, but I found it to be fast and comfortable. (It runs at around 120mph for much of the way, compared with 165mph for Eurostar.)

Monday February 12: I spent the day with my cousin, exploring the Victoria & Albert Museum and walking through Hyde Park. Travel was mostly Circle Line between Liverpool St and South Kensington, and bus from Marble Arch to Victoria.

Tuesday February 13: My plans for the day were straightforward: travel to Canada Water station (Victoria Line to Green Park, then Jubilee Line to Canada Water) to meet an old friend for lunch; then return to Victoria to meet a former colleague for dinner. I set off quite a bit earlier than I needed to, and as I approached Canada Water station I had an inspiration. I would stay on the train to North Greenwich, and then ride the cable car across the Thames. So I did. I can imagine that in good weather the views must be spectacular, but in the steady rain…. Definitely the weirdest bit of public transportation on this trip.

Wednesday February 14: Yet another “lunch meeting with an old friend” – but this time in Manchester. I took the Victoria Line from Pimlico to Euston, and boarded an Avanti Class 390 Pendolino for the 2 hr 6 min journey to Manchester. I decided to indulge myself by booking First Class, and the service was excellent. I met my friend at Manchester Piccadilly station, and since we are both transport nerds, we spent the time before lunch in a thoroughly appropriate fashion. We took a Metrolink tram out to Manchester Airport, and then rode on one of the new electric buses back into the city.

Thursday February 15: The last day of my trip, and there was one more transportation decision to make: how to get to Heathrow? There are three obvious choices: the Piccadilly Line (slow, cheap), the Heathrow Express from Paddington (fast, expensive), and the Elizabeth Line (moderately priced, new and interesting). I chose the Elizabeth Line, of course. I took the Circle up to Paddington and walked across to the new Elizabeth Line station. (I understand the infrastructure constraints, but the need to tap in and tap out twice seems less integrated than I’d expect.) The trains feel like an odd hybrid of tube and mainline design, with both longitudinal and transverse seating. But they’re comfortable and fast, and definitely better than the Piccadilly!

Then home. As on my outbound flight, the United 777 was only about half full, and I had a complete row to myself. Immigration was almost as fast as at Heathrow; I have Global Entry, so it was simply “look at the camera, flash my US passport to a human”. Parts of San Francisco airport seemed eerily empty. Since we’d got in quite early, I decided to try to switch to the earlier SFO-PDX flight. I snagged the last seat – middle, of course, but at least it was Economy Plus. And then Uber home.

So that was the trip. Commercial air, Ubers at both ends in the US, one taxi in London, and otherwise 100% public transportation. (I think of all of the mainline rail services in England as “public transportation”, because even though some of the operators are commercial entities, they are required to operate in accordance with their franchise arrangements.) One minor ticketing glitch in Paris, but otherwise flawless. Tired, but happy, and so glad to have seen all my friends and family.

Hubris and AI

Tim Bray just posted a delightful piece on the state of LLMs (large language model AI), and it got me thinking.

I think that one reason why people have been dismissive of AI (both the current efforts and the possibility) is that we humans tend to be convinced that our consciousness and capacity for reasoning are SPECIAL. Some of it’s religious (things like “souls”, “in g*d’s image”, etc.) and some is based on the fact that we tend to be awfully impressed by complex things we can’t understand. (I studied the philosophy of mind for a few years, and became increasingly frustrated by the way people would talk about “The Hard Problem” of consciousness.) But of course a lot of what we do and think is really mundane, and wouldn’t be much of a challenge to an orang-utan or a dolphin.

Funnily enough, I think that systems like GPT could well be useful deflators of our collective self-importance. People like Tim and I who have worked with Really Large Scale Systems are probably slightly ahead of the curve on this….

Doing my taxes….

I think there’s a lesson in here somewhere…

This week, I did our taxes. In the past, I’d handed that off to accountants, but we’ve simplified things and I was fairly confident that I could do it myself. The only complication was that we’d be filing jointly for Federal, but separately for State. No biggy, right?

I banged all the data into the basic TurboTax DIY web system, pulled stuff directly from banks and so forth, with no problems. But when it came to the State return, the system directed me to a dense page in the online help, which told me that I’d have to prepare two complete sets of returns – one joint, one separate – and file the Federal from the first and the States from the second. And because #reasons, I’d only be able to file by mail.

This sounded tedious, and error-prone. The obvious answer: upgrade to the premium TurboTax service, and let them do it for me. So I clicked Upgrade, and started a web chat with a Product Expert. (Not the Preparer, just a gatekeeper.) She told me I’d have to begin by uploading all my documents into the Checklist. “But I’ve already entered all the data, it’s in the system.” “Oh, no, we can only see the stuff that was imported electronically. None of the data that you typed yourself will transfer to the new return.” When I pointed out that this was crazy, she offered the following: “OK, I can drop you out of Full Service to let you get at the return you’d been working on. Then I suggest you screenshot each page. You can then return to Full Service and upload the screenshots, so that your Tax Preparer can enter the data.”

At first, I resisted. I tried to enter some of my data using the Full Service checklist, but it was completely useless. For example, there was a checklist item for Social Security income… but it only allowed you to enter one SSA-1099. If you have several, you’re SOL.

Reluctantly, I did what the Product Expert had suggested. I retraced my steps through the DIY web interface, and built up a folder containing 45 screenshots and 6 other documents, like photo IDs. I switched back to Full Service, and started to upload the files, to prepare for the call with my Tax Preparer.

After I’d uploaded 22 files, the system refused to upload any more.

First, I had a stiff drink. (I’ve moved from single malt Scotch to Bourbon recently.) Then I cancelled Full Service, and bought a copy of the Download/CD TurboTax Windows application. To my surprise, it successfully imported all the data I’d entered into the online service. I finished the whole Joint/Separate stuff in about an hour, with no significant issues. In part, this was because the UI of the Windows application was vastly superior to TurboTax’s web-based services, and I had full control over the various versions of the return. The validation and issue resolution UX was crystal clear, highlighting each problem by showing the exact tax form fields involved. (In the online version, you have no visibility whatsoever into the relationship between the on-screen dialog and the resulting tax form entries.)

So what are the lessons here? It’s instructive that the only smooth data interchange occurred when I imported web data into the desktop app. I can imagine the desktop app team thinking, “These online services are likely to be problematic: let’s make sure we can always give the user a way of recovering by moving to the more reliable, old-school desktop environment.” Yes, there’s always a risk that the old desktop app will become an orphan in the world of web and mobile solutions, but I think TurboTax has managed to avoid that…. probably because accountants tend to be conservative. (And a good thing too!)