Back in 1992, Chris Columbus was either the guy who directed "Home Alone" or the guy who "discovered" America. And that year, Giorgetto Giugiaro's Italdesign released a wild concept vehicle called The Columbus (in memory of the latter CC).
The "ultra-high level status vehicle" was meant to celebrate the 500th anniversary of Christopher Columbus' voyage to America (before that came to be seen as problematic). The minivan's swoopy, nautical style lines say Nina, Pinta and Santa Maria more than they do Dodge Caravan.
"Mini" van might be the wrong term, though that's what Italdesign called it. The vehicle was actually 6 meters (19.7') long and seated up to nine people.
The driving position was elevated for better visibility, and the engine—a 5-liter BMW V-12—sat beneath the driver. Startlingly for the time, the driver's seat was in the center of the vehicle. (The McLaren F1, which also adopted this arrangement, wasn't released until the following year.)
I rewatched Wes Anderson’s 2009 film Fantastic Mr. Fox and I have to say that it might be my very favorite of his films. Now, I get it if you don’t like Anderson. It’s an aesthetic alright, and if it isn’t yours, you probably find him unbearable. As for the films being all the same, well, that’s true of many directors, so I don’t have a lot of patience for that argument. It’s just that you don’t like the films being all the same in this kind of way. Again, that’s OK. I admit that his whimsy and put-ons have had varying degrees of effectiveness over a now pretty long career. But I do like it when Anderson turns to animation, and his adaptation of Roald Dahl’s story is pretty great. For one, George Clooney is absolutely perfect as the fox. The combination of ridiculousness, arrogance, and male ego works very well, and Clooney can really deliver that. Meryl Streep as Mrs. Fox is almost as good, though she’s asked to do less. Of course Jason Schwartzman, Bill Murray, and Owen Wilson show up doing basically their normal thing in different roles. I am amused by Brian Cox as the reporter. Willem Dafoe as the rat is great too. And I happen to love the animation, though I confess to being no expert on the subject, with not nearly as much knowledge as anyone who cares about it in more than a passing way. I’m surprised this didn’t make a ton of money, though it did generate a small profit at the time that has grown over the years. Anyway, it’s highly enjoyable. After all, we are all just wild animals.
From the Beetle to the Rabbit to the Golf, Volkswagen has long made practical cars that those with smaller budgets can afford. Now they're aiming that prowess at the EV market with the unveiling of their ID EVERY1.
The concept car is intended to make the transition into production for €20,000 (USD $21,667) in 2027. Hailed as "affordable entry-level all-electric mobility" by Thomas Schäfer, CEO of Volkswagen Passenger Cars, the 94hp vehicle will reportedly have a range of 155 miles.
Aesthetically, the vehicle was designed to have a friendly, approachable look. "Our ambition was to create something bold yet accessible," says Andreas Mindt, Volkswagen's Head of Design. "The ID EVERY1 has a self-assured appearance but remains likeable – thanks to details such as the dynamic front lights and the 'smiling' rear. These design elements make it more than just a car: they give it character and an identity that people can relate to."
The big question for Americans is whether that €20,000 sticker will apply in 2027, given the way our current administration's tariff war is going. The point may be moot; sadly, VW has announced no plans to bring this affordable EV to the U.S. market. With any luck things will change in two years' time.
I was talking to a friend about how to add a directory to your PATH today. It’s
something that feels “obvious” to me since I’ve been using the terminal for a
long time, but when I searched for instructions for how to do it, I actually
couldn’t find something that explained all of the steps – a lot of them just
said “add this to ~/.bashrc”, but what if you’re not using bash? What if your
bash config is actually in a different file? And how are you supposed to figure
out which directory to add anyway?
So I wanted to try to write down some more complete directions and mention some
of the gotchas I’ve run into over the years.
step 1: figure out which shell you’re using
If you’re not sure what shell you’re using, here’s a way to find out. Run this:
ps -p $$ -o pid,comm=
if you’re using bash, it’ll print out 97295 bash
if you’re using zsh, it’ll print out 97295 zsh
if you’re using fish, it’ll print out an error like “In fish, please use
$fish_pid” ($$ isn’t valid syntax in fish, but in any case the error
message tells you that you’re using fish, which you probably already knew)
Also bash is the default on Linux and zsh is the default on Mac OS (as of
2024). I’ll only cover bash, zsh, and fish in these directions.
step 2: find your shell’s config file
in zsh, it’s probably ~/.zshrc
in bash, it might be ~/.bashrc, but it’s complicated, see the note in the next section
in fish, it’s probably ~/.config/fish/config.fish (you can run echo $__fish_config_dir if you want to be 100% sure)
a note on bash’s config file
Bash has three possible config files: ~/.bashrc, ~/.bash_profile, and ~/.profile.
If you’re not sure which one your system is set up to use, I’d recommend
testing this way:
add echo hi there to your ~/.bashrc
Restart your terminal
If you see “hi there”, that means ~/.bashrc is being used! Hooray!
Otherwise remove it and try the same thing with ~/.bash_profile
You can also try ~/.profile if the first two options don’t work.
(there are a lot of elaborate flow charts out there that explain how bash
decides which config file to use, but IMO it’s not worth internalizing them –
just testing is the fastest way to be sure)
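For example, the whole test looks roughly like this (a sketch – swap in ~/.bash_profile or ~/.profile if ~/.bashrc turns out not to be the one):
$ echo 'echo hi there' >> ~/.bashrc
then open a new terminal, and if it greets you with
hi there
that config file is the one being used. Remember to remove the echo hi there line again afterwards.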
step 3: figure out which directory to add
Let’s say that you’re trying to install and run a program called http-server
and it doesn’t work, like this:
$ npm install -g http-server
$ http-server
bash: http-server: command not found
How do you find what directory http-server is in? Honestly in general this is
not that easy – often the answer is something like “it depends on how npm is
configured”. A few ideas:
Often when you first set up a new installer (like cargo, npm, homebrew, etc),
it’ll print out some directions about how to update your PATH. So if you’re
paying attention you can get the directions then.
Sometimes installers will automatically update your shell’s config file
to update your PATH for you
Sometimes just Googling “where does npm install things?” will turn up the
answer
Some tools have a subcommand that tells you where they’re configured to
install things, like:
Node/npm: npm config get prefix (then append /bin/)
Go: go env GOPATH (then append /bin/)
asdf: asdf info | grep ASDF_DIR (then append /bin/ and /shims/)
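For example, here’s roughly how that goes with npm (the ~/.npm-global prefix here is just how my machine happens to be configured for the example below – yours will probably print something different):
$ npm config get prefix
/Users/bork/.npm-global
so the directory to add would be /Users/bork/.npm-global/bin, which you can also write as ~/.npm-global/bin.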
step 3.1: double check it’s the right directory
Once you’ve found a directory you think might be the right one, make sure it’s
actually correct! For example, I found out that on my machine, http-server is
in ~/.npm-global/bin. I can make sure that it’s the right directory by trying to
run the program http-server in that directory like this:
$ ~/.npm-global/bin/http-server
Starting up http-server, serving ./public
It worked! Now that you know what directory you need to add to your PATH,
let’s move to the next step!
step 4: edit your shell config
Now we have the 2 critical pieces of information we need:
Which directory you’re trying to add to your PATH (like ~/.npm-global/bin/)
Where your shell’s config is (like ~/.bashrc, ~/.zshrc, or ~/.config/fish/config.fish)
Now what you need to add depends on your shell:
bash instructions:
Open your shell’s config file, and add a line like this:
export PATH=$PATH:~/.npm-global/bin/
(obviously replace ~/.npm-global/bin with the actual directory you’re trying to add)
zsh instructions:
You can do the same thing as in bash, but zsh also has some slightly fancier
syntax you can use if you prefer:
path=(
$path
~/.npm-global/bin
)
fish instructions:
In fish, the syntax is different:
set PATH $PATH ~/.npm-global/bin
(in fish you can also use fish_add_path, some notes on that further down)
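Once you’ve restarted your shell (see the next step), you can double check that the directory actually made it in. In bash or zsh, something like this works (the grep pattern and the output here are just from the ~/.npm-global example in this post):
$ echo "$PATH" | tr ':' '\n' | grep npm-global
/Users/bork/.npm-global/bin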
step 5: restart your shell
Now, an extremely important step: updating your shell’s config won’t take
effect until you restart your shell!
Two ways to do this:
open a new terminal (or terminal tab), and maybe close the old one so you don’t get confused
Run bash to start a new shell (or zsh if you’re using zsh, or fish if you’re using fish)
I’ve found that both of these usually work fine.
And you should be done! Try running the program you were trying to run and
hopefully it works now.
If not, here are a couple of problems that you might run into:
problem 1: it ran the wrong program
If the wrong version of a program is running, you might need to add the
directory to the beginning of your PATH instead of the end.
For example, on my system I have two versions of python3 installed, which I
can see by running which -a:
$ which -a python3
/usr/bin/python3
/opt/homebrew/bin/python3
The one your shell will use is the first one listed.
If you want to use the Homebrew version, you need to add that directory
(/opt/homebrew/bin) to the beginning of your PATH instead, by putting this in
your shell’s config file (it’s /opt/homebrew/bin/:$PATH instead of the usual $PATH:/opt/homebrew/bin/)
export PATH=/opt/homebrew/bin/:$PATH
or in fish:
set PATH /opt/homebrew/bin $PATH
problem 2: the program isn’t being run from your shell
All of these directions only work if you’re running the program from your
shell. If you’re running the program from an IDE, from a GUI, in a cron job,
or some other way, you’ll need to add the directory to your PATH in a different
way, and the exact details might depend on the situation.
in a cron job
Some options:
use the full path to the program you’re running, like /home/bork/bin/my-program
put the full PATH you want as the first line of your crontab (something like
PATH=/bin:/usr/bin:/usr/local/bin:….). You can get the full PATH you’re
using in your shell by running echo "PATH=$PATH".
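For example, the top of a crontab using that second option might look something like this (the PATH value and the job here are purely illustrative – paste in your own echo "PATH=$PATH" output instead):
PATH=/usr/local/bin:/usr/bin:/bin:/home/bork/.npm-global/bin
# run http-server every day at 8am
0 8 * * * http-server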
I’m honestly not sure how to handle it in an IDE/GUI because I haven’t run into
that in a long time, will add directions here if someone points me in the right
direction.
problem 3: duplicate PATH entries making it harder to debug
If you edit your path and start a new shell by running bash (or zsh, or
fish), you’ll often end up with duplicate PATH entries, because the shell
keeps adding new things to your PATH every time you start your shell.
Personally I don’t think I’ve run into a situation where this kind of
duplication breaks anything, but the duplicates can make it harder to debug
what’s going on with your PATH if you’re trying to understand its contents.
Some ways you could deal with this:
If you’re debugging your PATH, open a new terminal to do it in so you get
a “fresh” state. This should avoid the duplication.
Deduplicate your PATH at the end of your shell’s config (for example in
zsh apparently you can do this with typeset -U path)
Check that the directory isn’t already in your PATH when adding it (for
example in fish I believe you can do this with fish_add_path --path /some/directory)
How to deduplicate your PATH is shell-specific and there isn’t always a
built in way to do it so you’ll need to look up how to accomplish it in your
shell.
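For example, in zsh I think you can put this at the end of your ~/.zshrc (a sketch based on the typeset -U trick mentioned above):
# keep only the first copy of each directory in the path array (zsh ties it to $PATH)
typeset -U path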
problem 4: losing your history after updating your PATH
Here’s a situation that’s easy to get into in bash or zsh:
Run a command (it fails)
Update your PATH
Run bash to reload your config
Press the up arrow a couple of times to rerun the failed command (or open a new terminal)
The failed command isn’t in your history! Why not?
This happens because in bash, by default, history is not saved until you exit
the shell.
Some options for fixing this:
Instead of running bash to reload your config, run source ~/.bashrc (or
source ~/.zshrc in zsh). This will reload the config inside your current
session.
Configure your shell to continuously save your history instead of only saving
the history when the shell exits. (How to do this depends on whether you’re
using bash or zsh, the history options in zsh are a bit complicated and I’m
not exactly sure what the best way is)
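For example, in bash the “continuously save” option might look something like this in your config file (a sketch – there are other history settings you may want to tune too):
# append to the history file instead of overwriting it when the shell exits
shopt -s histappend
# write each command out to the history file as soon as it runs
PROMPT_COMMAND='history -a'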
a note on source
When you install cargo (Rust’s package manager) for the first time, it gives you
these instructions for how to set up your PATH, which don’t mention a specific
directory at all.
This is usually done by running one of the following (note the leading DOT):
. "$HOME/.cargo/env" # For sh/bash/zsh/ash/dash/pdksh
source "$HOME/.cargo/env.fish" # For fish
The idea is that you add that line to your shell’s config, and their script
automatically sets up your PATH (and potentially other things) for you.
This is pretty common (for example Homebrew suggests you eval brew shellenv), and there are
two ways to approach this:
Just do what the tool suggests (like adding . "$HOME/.cargo/env" to your shell’s config)
Figure out which directories the script they’re telling you to run would add
to your PATH, and then add those manually. Here’s how I’d do that:
Run . "$HOME/.cargo/env" in my shell (or the fish version if using fish)
Run echo "$PATH" | tr ':' '\n' | grep cargo to figure out which directories it added
See that it says /Users/bork/.cargo/bin and shorten that to ~/.cargo/bin
Add the directory ~/.cargo/bin to PATH (with the directions in this post)
I don’t think there’s anything wrong with doing what the tool suggests (it
might be the “best way”!), but personally I usually use the second approach
because I prefer knowing exactly what configuration I’m changing.
a note on fish_add_path
fish has a handy function called fish_add_path that you can run to add a directory to your PATH like this:
fish_add_path /some/directory
This is cool (it’s such a simple command!) but I’ve mostly stopped using it, for this reason:
Sometimes fish_add_path will update the PATH for every session in the
future (with a “universal variable”) and sometimes it will update the PATH
just for the current session and it’s hard for me to tell which one it will
do. In theory the docs explain this but I could not understand them.
Hopefully this will help some people. Let me know (on Mastodon or Bluesky) if
there are other major gotchas that have tripped you up when adding a
directory to your PATH, or if you have questions about this post!
Gene Hackman, who never fit the mold of a Hollywood movie star, but who became one all the same, playing seemingly ordinary characters with deceptive subtlety, intensity and often charm in some of the most noted films of the 1970s and ’80s, has died, the authorities in New Mexico said on Thursday. He was 95.
Mr. Hackman and his wife were found dead on Wednesday afternoon at a home in Santa Fe, N.M., where they had been living, according to a statement from the Santa Fe County Sheriff’s Department. Sheriff’s deputies found the bodies of Mr. Hackman; his wife, Betsy Arakawa, 64; and a dog, according to the statement, which said that foul play was not suspected.
Ok. So, there are circumstances.
Hackman is, I think, my favorite actor of the 70s generation, which obviously is an extremely tough crowd to get ahead in. I listen to Smartless religiously, especially the eps that interview actors, and it’s fascinating to listen to how people in the business now have a combination of fear and reverence for that generation. Hackman carried an easy combination of wisdom and menace, which he could leverage from role to role. My favorites list is tough, but in no order:
The Conversation
Royal Tenenbaums
French Connection
Unforgiven
Get Shorty
The Quick and the Dead
The Firm
No Way Out
Superman I & II
Night Moves
I appreciate that’s an awfully idiosyncratic list but he had an idiosyncratic career.
Photo credit: By Christopher Michael Little, CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=10689828
In 1880, Edward Cabot Clark embarked on a risky endeavor--the erection of the first-class apartment house on Central Park West that he would call The Dakota. His challenge was convincing potential residents that a multi-family building--considered at the time middle-class at best--could be appropriate to wealthy families. His solution was to provide apartments that were equal to upscale private homes.
The following year, eight millionaires formed the Berkshire Apartment Association to erect another high-end apartment building, The Berkshire. Among the shareholders were Alexander Guild; Fletcher Harper, of publishing firm Harper Brothers; and Edward M. Shepard, of the furniture company Stickney & Shepard. Carpentry and Building explained in September 1881 that it would be a co-operative and said, "It is expected that each of these shareholders will occupy one of the apartments in the Berkshire when the building is completed, and this apartment will be his property permanently." The other apartments, said the article, "are to be rented out."
The syndicate hired German-born architect Carl Pfeiffer to design the building. The nine-story structure was completed in 1883. Pfeiffer's tripartite, Queen Anne-style design sat upon a two-story granite base. The midsection was faced in "Croton pressed brick, with stone, terra cotta, and molded brick ornamentation," according to Carpentry and Building. It featured picturesque, paneled bays with curved sides. The top section harkened to 17th century England or Germany with half-timbering, gables and prominent chimneys. The flat roof was paved with tiles and "hanging gardens of flowers in ornamented boxes" lined the edges. Carpentry and Building said that from this "lofty promenade," residents could enjoy views of "Long Island, Long Island Sound, and the Palisades of the Hudson."
American Architect & Building News, August 3, 1883 (copyright expired)
The main entrance on Madison Avenue was accessed above a short, doglegged stoop. Servants and tradespeople used an entrance in the courtyard at the rear.
Complex Queen Anne-style upper panes included stained glass inserts. American Architect & Building News, December 22, 1888 (copyright expired)
The Berkshire held 17 apartments, two each per floor through the seventh, and three on the eighth. Each resident also had a second servant's room and a trunk room on the ninth floor. Carpentry and Building explained, "Each of these apartments will consist of a library, a dining room, a parlor, a kitchen, a bath-room, a laundry, a servants' room, abundance of closet room, and four bedrooms."
In the lobby was a marble staircase with "railings of colored marble," according to The American Architect and Building News on August 4, 1883, and two elevators--one for the residents and the other for servants. The gas fixtures in the apartments were designed to be easily converted to electricity in the future. The article said, "Every convenience known to modern improvement will be introduced in the house, which is intended to rival the Paris palais in elegance and comfort."
The basement level was outfitted for the janitor's apartment and rental offices for physicians. The owners of the Berkshire assured potential residents that there would be no offensive trashcans or odors. In the cellar was "an apparatus for cremating the refuse of the kitchen," said Carpentry and Building, which added, "No slop-barrels are to disfigure the sidewalk in front of the Berkshire. The refuse will all be dried by steam and then burned."
A typical floorplan, with two apartments per floor--left and right. The American Architect and Building News, January 17, 1891 (copyright expired)
Elevators in the 1880s, of course, did not have the safety measures we take for granted today. Many of them resembled ornate birdcages, their openwork structures presenting dangers to passengers wearing Victorian garments. On November 18, 1887, Winifred Egan visited a friend at the Berkshire. She never made it to the apartment. The Sun reported that she died from injuries caused "by having her dress caught in passing one of the floors while in the elevator." On January 20, 1888, a coroner's jury "censured the proprietors of the house for employing an incompetent elevator boy."
Carl Pfeiffer's Queen Anne design carried into the interior, as well. American Architect & Building News, August 4, 1883 (copyright expired)
An early resident of the Berkshire was millionaire William Marsh Rice and his wife, the former Julia Elizabeth Baldwin, who went by her middle name. Elizabeth was Rice's second wife, the first having died. (Interestingly, Elizabeth's sister was the wife of William's brother, Frederick Rice.)
William Marsh Rice, from the collection of Rice University
Born in Springfield, Massachusetts in 1816, Rice began his Horatio Alger-type story as a grocery store clerk. By 1860 he was reportedly the wealthiest man in Houston, Texas, the owner or part-owner of real estate holdings, lumber firms, railroads, and cotton concerns. He and Elizabeth moved to New York in 1882. Their 160-acre country estate was in Dunellen, New Jersey.
According to historian J. T. McCants in his 1955 article, "85 Years of Capitalism: The Story of William M. Rice," the Rices' marriage "was stormy" by the 1890s. According to McCants, around 1892, Elizabeth "consulted an attorney, A. G. Allen, about a divorce." She would never obtain that divorce. A common method of removing a bothersome family member at the time was to have them committed. McCants said Elizabeth, "died in Waukesha, Wisconsin on 24 July 1896 hopelessly insane."
At the time of Elizabeth's death, Rice's estate was estimated at "about four million dollars," according to McCants. (The figure would translate to about $150 million in 2025.) His will, executed in 1891, directed that his massive fortune should be used to found the William M. Rice Institute for the Advancement of Literature, Science and Art in Houston, Texas.
Rice's valet, Charles F. Jones, had been with him since his Texas years. The cherished servant discovered the multimillionaire dead in his bed on September 23, 1900. The death certificate declared his demise the result of "old age and extreme nervousness." But three days later, The New York Times revealed the first hint that officials were suspicious in reporting, "His body was to have been cremated yesterday morning, but instead after funeral services had been held at the house...it was taken to the Morgue and an autopsy was performed."
American Architect & Building News published this depiction of a Berkshire parlor on August 4, 1883 (copyright expired)
The autopsy revealed that Rice "died of arsenical and mercurial poisoning," reported The Evening World on October 27. The case had proceeded rapidly and the article disclosed that Charles F. Jones, the trusted valet, and Albert T. Patrick, Elizabeth Rice's former lawyer, had been arrested for murder.
The trial, which became one of the most sensational for decades, revealed that Patrick had forged a new will that left a large portion to himself, and had persuaded Jones to assist in the murder. On June 9, 1905, The Evening World ran a banner, all-caps headline, "PATRICK TO DIE IN THE CHAIR." (Instead, however, in 1906 his sentence was commuted by Governor Frank Higgins and in 1912 he received a full pardon from Governor John A. Dix. Charles F. Jones was not charged.)
William Marsh Rice's fortune, as he intended, was used to found the William Marsh Rice Institute, known today as Rice University.
A ground-floor apartment was advertised for rent in October 1904. The advertisement described "parlor, library, dining room, 3 family bedrooms, 3 servants' bedrooms, kitchen, etc." The rent was $4,500 per year--a significant $13,250 per month in 2025 terms.
Among the well-heeled residents of the Berkshire at the time was stockbroker Franklin William Gilley. Born in 1840, he was elected to the Stock Exchange in 1864. Gilley was a member of F. W. Gilley, Jr. & Co. and had been treasurer of the New York Stock Exchange since 1895.
By the second decade of the 20th century, many of the mansions in the Madison Avenue neighborhood had been razed for commercial buildings. The Berkshire was now an architectural anachronism. On August 17, 1913, The Sun mentioned that "the apartment house known as the Berkshire, at 500 Madison avenue...is now being altered and modernized." And, indeed, it was. The renovations stripped the Berkshire of its charming Queen Anne personality. Without the oriels, gables and half-timbering, it looked as if it had been erected a year earlier.
image via the NYC Dept of Records & Information Services.
For about a decade, however, the building continued to be home to wealthy families. Their names routinely appeared in the society pages that reported on their travels, debutante entertainments, dinner parties and weddings. Then in 1925 the Berkshire was converted to an upscale residential hotel.
An advertisement in Town & Country on November 1, 1926 said, "to each and every heckled, non-plussed householder--The Berkshire will prove a revelation." The ad continued,
Never, does the cook "take a day off"...Never, does Basil, the butler, decide to locate in Chicago to be near his aunt...Never is it necessary to dismiss Marie for impertinence...Never, in fact, do any of those things that heckle and non-plus householders occur...An Arcadian town-house--The Berkshire.
The suites, "as large or small as you wish," were available either unfurnished or furnished. The furnished apartments had been decorated by B. Altman & Co. The ad stressed, "And everything--maid and valet service; electric light and refrigeration; meal service in your own rooms, is included in your rental!"
The Berkshire survived until 1953, when it was replaced by a 19-story and penthouse apartment house.
So it’s all fine and good to have this protest action, but I have a question.
Why is it that protest actions these days almost always advertise themselves as nonviolent?
Let’s be clear, I am most certainly not advocating for violence, which would almost certainly be stupid and counterproductive. I am however curious as to the process by which we fetishize nonviolence to the point that we have to define any political action as nonviolent upfront. Even leaving aside the fact that Martin Luther King was a gun owner and that guns were central to self-defense in the civil rights movement, it’s still kind of weird despite the bad history behind it. I am trying to wrap my brain around this.
It’s almost as if protestors today want to advertise that they will do nothing to threaten the system. It’s not as if, unless you live in Portland or Seattle, there are organized groups of anarchists who want to break shit that you have to worry much about in these protests. It just seems to me, again, to be an announcement that we aren’t threatening in any way, shape, or form, that we will hold our little action and go home (probably using public transportation) and we can be ignored.
Tragically, Los Angeles resident Brandon Sanders found his home had burned to the ground in the Eaton fire.
There was one good piece of news: His Tacoma had been parked far enough away from the burning structure that only the front of the vehicle was scorched. To his surprise, when he tried to start it "it fired right up," he writes. "Everything works, even the headlights and blinkers!"
Social media being social media, there are now posts going around claiming that Tacomas are fireproof. It should be obvious to the sane, but Sanders' experience with the truck was very good luck. In this other photo, we see a house that was unscathed by the fire:
Note the unlucky SUV on the neighboring property that burned. You can see that aluminum from the vehicle has melted and flowed down the driveway. Aluminum melts at 1,221° Fahrenheit (660° Celsius). The Tacoma was not exposed to that temperature, or it would look like this SUV.
The unburned house, by the way, was designed by architect Greg Chasen. "Some of the design choices we made here helped," he writes. "But we were also very lucky."
If you're interested in what design decisions can harden a house against fire, in this comprehensive video—which has gone viral—homebuilder Matt Risinger analyzes two unburnt L.A. homes. One is the Chasen-designed house, and the other belongs to Tom Hanks:
In The Sandman, the DC comic-book series that ran from 1989 to 1996 and made Gaiman famous, he tells a story about a writer named Richard Madoc. After Madoc’s first book proves a success, he sits down to write his second and finds that he can’t come up with a single decent idea. This difficulty recedes after he accepts an unusual gift from an older author: a naked woman, of a kind, who has been kept locked in a room in his house for 60 years. She is Calliope, the youngest of the Nine Muses. Madoc rapes her, again and again, and his career blossoms in the most extraordinary way. A stylish young beauty tells him how much she loved his characterization of a strong female character, prompting him to remark, “Actually, I do tend to regard myself as a feminist writer.” His downfall comes only when the titular hero, the Sandman, also known as the Prince of Stories, frees Calliope from bondage. A being of boundless charisma and creativity, the Sandman rules the Dreaming, the realm we visit in our sleep, where “stories are spun.” Older and more powerful than the most powerful gods, he can reward us with exquisite delights or punish us with unending nightmares, depending on what he feels we deserve. To punish the rapist, the Sandman floods Madoc’s mind with such a wild torrent of ideas that he’s powerless to write them down, let alone profit from them.
As allegations of Gaiman’s sexual misconduct emerged this past summer, some observers noticed Gaiman and Madoc have certain things in common. Like Madoc, Gaiman has called himself a feminist. Like Madoc, Gaiman has racked up major awards (for Gaiman, awards in science fiction and fantasy as well as dozens of prizes for contemporary novels, short stories, poetry, television, and film, helping make him, according to several sources, a millionaire many times over). And like Madoc, Gaiman has come to be seen as a figure who transcended, and transformed, the genres in which he wrote: first comics, then fantasy and children’s literature. But for most of his career, readers identified him not with the rapist, who shows up in a single issue, but with the Sandman, the inexhaustible fountain of story.
I’m a late-comer to Gaiman (I only read American Gods last year and never really had the opportunity to become a fanatic), so this isn’t psychologically catastrophic for me in the way it is for some of his more dedicated fans. I also know a few people who are personally acquainted with Gaiman and pretty much all of them have indicated that they only find the revelations surprising in degree rather than in kind.
Where are we with respect to the artist and the work at the dawn of this post-woke age? In the future I don’t plan to avoid any Gaiman-related project because of Gaiman, but at the same time I don’t think I’ll want to read or watch anything specifically because it’s Gaiman. I make no judgment of how anyone else approaches this; if digesting the work of a creep is too creepy for you, I’m in no position to tell you that you’re wrong and that you need to read American Gods or watch Sandman.
Photo credit: By Kyle Cassidy – By email, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=37378819
Somehow, this makes bishops appear the most peaceful as they comingle. But the pawns: put those vicious monsters in the farthest corner inside a double-walled enclosure. They eat everything!
I moved to Albuquerque 25 years ago this summer. Crazy. One of the first places I discovered there was Alphaville Video, your classic arty movie rental store. It was good. They had all kinds of things. Of course, the rise of Netflix killed it a few years later, and what has been better for cinema in the last quarter-century than that company….. Anyway, at the time they had VHS and DVD and I had both players. So I would do what the goddamn algorithms–a technology to create the worst and most boring version of yourself possible–can never do, which is allow you to browse and explore and pick things up and consider and then make a choice.
Well, one day, I was looking at their Asian section and I wanted something that wasn’t Japanese or from one of the relatively few arty Chinese directors whose films were available in the US. I picked up a VHS tape of a film called Turumba, directed by a Filipino guy named Kidlat Tahimik and released in 1981.
What I ended up watching was the best film about globalization I had ever seen or still have ever seen.
The story goes like this–there’s a family in a small town in the Philippines, probably not all that far from Manila, but far enough without a highway, which happens to be under construction. The time is 1970 or so. The family is really artistic. The father teaches music and is the lead cantor at the town’s religious festivals, called Turumba. The grandmother created a really advanced way to make the papier-mache toys popular in the town. Grandma is still around teaching the craft with an emphasis on craft. One day a German woman comes to town and sees the toys. As it turns out, she’s a scout for German manufacturers. She buys everything they have. Those things sell, she orders more, and pretty soon, she’s making big orders. Out goes the craftsmanship and in come mass-produced toys to commemorate the Munich Olympics in ’72. Out goes spending time playing music and in come long days in the nascent factory, not only for the family’s kids (it’s told through the roughly 10-year-old son of the cantor/head of the household who becomes the factory owner) but for the kids of the town. Out goes hanging out at night and in come electric fans and TVs and cars. Out goes the joy of life and in comes the sadness of capitalism, a sadness that few actually want to reject because of the material upsides.
What makes the film so brilliant is the ambivalence. It isn’t romanticizing the people or place. They already exist in a globalized world. The kid loves his Batman t-shirt. It’s just starting at a given time–a time when globalization already is impacting a community in one way–and demonstrating what happens when that globalization goes into overdrive. It’s also not propaganda. It’s certainly a critique of neocolonialism, yes, but done the right way. The film really is about ambivalence. It’s funny. There are little asides that amuse. It’s filmed like a documentary but is not ham-fisted at all. It’s super cheaply made–the subtitles take up half the screen and Criterion Channel pretty clearly just did the best they could with a VHS copy since I am pretty sure this was never put on DVD. I would absolutely recommend watching this. I was amazed to see it show up and I was so happy. I watched it a couple of times, then the video store closed, and I hadn’t seen it in 20 years. It was just as good as I remembered.
This leads me to two additional points. One is that for as wonderful as Criterion Channel is, it’s quite striking how even film buffs just want to watch 80s and 90s big budget films they remembered liking back in college. The monthly programming now is deeply skewed in that direction, with very little on foreign films. This month includes a Nicole Kidman retrospective; a collection of films called “Surveillance Cinema,” which is a way to organize The Truman Show and Minority Report and Gattaca into a respectable Criterion thing; and three Cameron Crowe-directed films. There’s also a couple of collections around older Hollywood films, but it’s pretty clear that there really is no market for foreign films, even among cinephiles, in this country. I get that Criterion is responding to the market. The problem is that it is very hard to search for films otherwise unless you are looking for something specific. If you try, you can search by country and if you put in Philippines, a bunch of things come up, but you have to think of that yourself. If I hadn’t seen this film in the Recently Added category, I would probably have never found it.
The other thing rewatching Turumba made me consider is how villages become centers of a specific type of craft. I’ve been in Oaxaca for nearly two months. You might be familiar with the alebrijes that come from here, the fantastically artistic wooden animals. They are cool, I grant you. I have a few. The story of these here seems to be similar–something a few people did, then a British filmmaker brought some of these people abroad, they got popular, and now the economies of two entire towns revolve around making these things. I very slightly know an anthropologist who has written a book on this and I guess I should read it. How do specific towns rearrange their economies to produce what were once crafts for a mass global market?
I would guess I don’t fly very much compared to a lot of the LGM community. I’ve probably averaged something like five flights a year in recent years. So I was taken aback last week when flying back from the holidays in Michigan to Colorado by the following experience.
I had the middle seat in a three-seat row. This is because I was too cheap to pay to “upgrade” to an aisle or window seat on the United Airlines flight. (I suppose this kind of thing is economically efficient, but I bet the constant nickel and diming on everything in this world of hypercapitalism is a big reason why everybody is in such a bad mood).
I had just sat down in awkward proximity to the two total strangers on each side of me, when the 30ish woman in the window seat said something to me about my elbow being on the armrest. I don’t remember her exact words, but I was, absurdly in retrospect, embarrassed and slightly flustered, and apologized for not knowing the relevant etiquette. I mean I don’t fly much, relatively speaking, but I’ve still taken hundreds of flights over the last 45 years or so, so I probably shouldn’t have immediately assumed I was in the wrong, but hey that’s how we ended up with January 6th I suppose.
Then this person did something so odd that I still can’t quite believe she did it. She showed me the screen of her phone, which featured a text to someone that read “I’m sitting next to Elbow Guy,” under a photo of my elbow on the armrest! This made me feel even more disconcerted by my apparent faux pas, although a little light went off somewhere in my mind, or in the back of my mind, that maybe this person was a little off her rocker, or “quirky” as we say in Boulder if the person’s net worth is at least eight figures.
Anyway, I later Did My Own Research ™ and discovered that I had a largely if not universally recognized right to BOTH armrests, which if I had done the math at the moment should have been deducible, since if I didn’t then the other two people in the row would each get two armrests, and, along with the privilege of not sitting in the middle, enjoy perfect armrest hegemony.
But the part of this story that still feels disconcerting was the texting of the photo of my elbow, along with my transformation into Elbow Guy. This felt somehow invasive of my privacy/space in some way related to larger issues with the information economy.
Middle gets both armrests. Window may lean on bulkhead (ymmv) and gets control of the shade. Aisle - unquestionably the best seat - gets: 1) to move whenever anyone desires to get up, 2) must pass drinks and food and trash, 3) elbow hit by drink and trash carts as well as passing butts, 4) smallest footwell, 5) must stare at lap/book to avoid eye contact with other passengers and the movie/Fox News graphics on the seats ahead of them.
Nurse Harriet Curley takes the pulse of a Navajo patient waiting in the dispensary of the Sage Memorial Hospital, an ultra-modern institution at the Ganado Mission, Arizona, a Presbyterian enterprise deep in the Navajo Indian reservation, Dec. 14, 1949. (AP Photo)
A couple of years ago, I saw the superb Mali Obomsawin play with her jazz band. She is Abenaki. She started talking about land acknowledgements and called them “corny” before going on to say that the real land acknowledgment is knowing that her ancestor was imprisoned in Boston for practicing his religion. That got pretty well at the absurdity of land acknowledgements. What do they actually do? The answer is usually nothing. At first, one might argue they were useful in the sense of reminding folks that the land does have a Native history. But pretty quickly they became a way for whites to engage in performative liberalism without any kind of commitment and then they became a way for corporations and wealthy institutions to give lip service to something progressive while doing absolutely nothing for Native Americans here today, including people of the tribes being mentioned! This has bothered me for years now.
If you work at a university, large corporation or left-leaning nonprofit or have attended certain performances, you have probably heard a land acknowledgment, a ritual that asks you to remember that Native Americans were here long before the peoples of Europe, Africa and Asia. The New York City Commission on Human Rights, for example, on its website “acknowledges the land politically designated as New York City to be the homeland of the Lenape (Lenapehoking) who were violently displaced as a result of European settler colonialism over the course of 400 years.”
The point is to make us more aware of the dispossession and violence that occurred in the establishment and expansion of the United States. But they’ve begun to sound more like rote obligations, and Indigenous scholars tell me there can be tricky politics involved with naming who lived on what land and who their descendants are. Land acknowledgments might have outlived their usefulness.
Instead of performing an acknowledgment of Native peoples, institutions should establish credible relationships with existing Native nations. In the United States, there are 574 federally recognized tribes, plus many state-recognized tribes and communities that own and manage land, operate social services and administer federal programs, much as counties and states do. They run tribal businesses and make small-business loans to their citizens. They provide jobs and revenue that help drive regional and rural economies. What they need from universities, corporations, nonprofits and local and state governments is partnerships that acknowledge and build on their continuing sovereignty.
…
The Native Governance Center notes that land acknowledgments often “become an excuse for folks to feel good and move on with their lives.” The journalists Graeme Wood and Noah Smith have criticized them as “moral exhibitionism” and ethnonationalism. In an interview Keith Richotte Jr., the director of the University of Arizona’s Indigenous peoples law and policy program and a citizen of the Turtle Mountain Band of Chippewa Indians, told me that if land acknowledgments “are treated as the only or last step of one’s commitment to Indigenous peoples and nations, then they can become more harmful than beneficial.”
Land acknowledgments tend to reinforce the myth of Native disappearance and irrelevance. In calling attention to dispossession, they often miss the point that Native Americans survived and are having a renaissance in culture and sovereignty. The vanishing-Indian myth has deep roots in American history. As part of taking Indigenous land, 19th-century Americans found it useful to believe that Indians were fading away. They described precolonial North America as a wilderness — “occupied by a few savage hunters,” as President Andrew Jackson put it, who “were annihilated or have melted away to make room for the whites.” Jean O’Brien, a historian and citizen of the White Earth Ojibwe Nation, called it a “narrative of Indian extinction that has stubbornly remained in the consciousness and unconsciousness of Americans.”
Tribes are still here and have had to go to court to defend their remaining sovereignty and property, spending their revenue to buy back land that once was theirs. In 1996 the Eastern Band of Cherokee Indians bought back one of their sacred sites, the Kituwah mound, which once sat at the center of the Cherokee Mother Town, and the Osage Nation has saved the only ancient pyramid mound remaining in St. Louis by buying its summit. In its 2020 McGirt v. Oklahoma decision, the Supreme Court ruled that the treaty-defined boundaries of the Muscogee, Cherokee, Quapaw, Choctaw, Chickasaw and Seminole nations remain in full force because Congress never disestablished their reservations. Yet the State of Oklahoma has continued to fight tribal jurisdiction over criminal cases. If tribes didn’t have to spend revenue buying back land and defending their interests in court, they could use more of it on the health, education and criminal justice programs that benefit their citizens and their neighbors.
My colleague Amanda Cobb-Greetham, the founding director of the Chickasaw Cultural Center in Sulphur, Okla., and a citizen of the Chickasaw nation, told me that instead of lengthy discussions about whether and how to write land acknowledgments, institutions should engage in active and meaningful relationships with the Native nations that are now or were on the lands those institutions occupy. Florida State University and the Seminole Tribe of Florida have established such a relationship, which started with the tribe’s involvement in designing the mascot’s regalia but now extends to other partnerships, including creating a Native American and Indigenous Studies Center.
I know some of these people and respect them very much and I can’t agree more. Have your land acknowledgement if you want, but if you aren’t actively doing something within your power to remedy injustice today, then it’s totally worthless. If you are a university, are you offering free tuition and fees to the Tribes in your area? If you are a professor, are you assigning work by Native scholars or centering Native voices? If you are running a corporation, are you engaging in affirmative action plans for the Tribes? There are lots of things we can be doing. But mostly, land acknowledgements exist to make whites feel good about themselves.
The new class of weight-loss drugs also seems to have miracle-drug impacts, and given that both good health and the lack thereof have enormous economic structures built around them, their impact is likely to be quite dramatic.
Some junk-food companies and alcohol sellers are freaking out about the prospect of reduced appetites or booze cravings. As they should: The average household with at least one family member on a GLP-1 is spending about 6 percent less on groceries each month within six months of adoption. That translates to about a $416 reduction in food and drink purchases per household a year. Spending reductions are even greater for high-income households, according to a new study by researchers at Cornell University and Numerator.
Some categories have been hit harder than others. For example, these households are spending about 11 percent less on chips and other savory snacks and 9 percent less on sweet bakery items. Select healthier foods, such as fresh fruits and yogurts, have gotten a very tiny bump.
There are some potential retail winners. For example, rapid weight loss has encouraged some patients to replace their wardrobes. The clothing rental company Rent the Runway recently reported that more customers are switching to smaller sizes than at any time in the past 15 years.
Airlines could save significant money on fuel if passengers slim down en masse, a financial firm projected. Life insurers could cash in, too, given the many mortality risks linked with chronic obesity. “Generally, running a life insurance company right before immortality is discovered — cancer vaccines, antiaging therapeutics — is a good business to be in!” said Zac Townsend, CEO of the life insurance company Meanwhile.
Nearly every GLP-1 user I’ve interviewed in the past year has also mentioned spending money on new hobbies, such as pottery classes or pickleball leagues. Some deliberately picked activities to replace social engagements that revolve around food or alcohol; others said they simply gained the energy and self-confidence to try new things.
“I am way more active than I have been,” said Mitchell, whom I interviewed for a recent PBS NewsHour story about Ozempic economics. “I took my daughters horseback riding on the beach last Christmas. We’ve been snow tubing, things that I would have never thought to do.”
OK, some of this seems anecdotal. However:
The Danish pharmaceutical company Novo Nordisk, maker of Ozempic and Wegovy, nearly single-handedly kept its home country’s economy out of recession last year while most of Europe struggled. And because Americans are the primary customers of these meds, U.S. dollars flowed heavily into Denmark, causing the Danish krone to strengthen relative to other currencies.
To keep the krone’s value steady relative to the euro, the Danish central bank had to cut interest rates. Put another way: Overweight Americans unintentionally helped Danes get cheaper mortgages.
Wow.
In any case, good health is good! And it has good effects!
Obesity is a chronic disease associated with dozens of other ailments, including joint problems and cancers. So helping Americans lose weight has the potential to make the public much healthier — and reduce spending on other (costly) care.
Seven women in Mitchell’s family, for instance, had breast cancer and both of her parents developed forms of dementia. Mitchell herself developed diabetes, too. All of these problems have linkages with obesity. “I don’t want to be sick,” said Mitchell, explaining why she turned to Wegovy after previously trying diets, exercise, therapy and surgery. “After taking care of my parents, I said, ‘I don’t want my children to have to take care of me.’” Her obesity is now in remission and she no longer has diabetes.
Of course, such potential health benefits — and cost savings — will materialize more broadly only if patients keep up with their medications and adopt healthier habits to help maintain lower weights. Which is a big if.
Research suggests most patients who were prescribed these meds stop taking them within a year. Some stop because they’ve successfully reached their goal weight. But many others report stopping because of costs, unpleasant side effects, drug shortages or squeamishness about needles.
Who knows what will happen with all of this. But it sure is one of the more fascinating things to come along in the last couple of years. I bet RFK will have thoughts……
Jimmy Carter has died. Carter was a pretty bad president and then one of the two greatest ex-presidents, along with John Quincy Adams. He’s become something of a beloved figure among liberals in the last twenty years or so, both because of his brave stance denouncing Israeli apartheid against Palestinians and because he lived his faith through Habitat and his other actions, with no sense of the hypocrisy so common among evangelicals. But still, Carter really sucked as president.
Born in 1924 in the small southwestern Georgia farming town of Plains, Carter grew up in the region’s small farming elite. His parents owned a lot of land and his father was a successful businessman. This gave the young man a lot of chances that even many Georgia whites did not have. Of course, his father was a staunch segregationist and they were the wealthiest family in a largely African-American area. But Jimmy went to the local public schools and succeeded there. Then he fulfilled his childhood dream of attending the Naval Academy in Annapolis. It took a while for a boy from southwest Georgia to make this happen. First, he spent a year at Georgia Southwestern College in Americus and then a year at Georgia Tech in Atlanta. Finally, in 1943, he got accepted to the Naval Academy. He did well, graduating 60th in a class of 820 in 1946. With World War II just having finished, the expanding U.S. military presence around the world required a lot of officers and Carter would spend the next seven years at bases all over the place, both in the U.S. and being deployed in the Atlantic and Pacific fleets. He started his family at this time too, having married Rosalynn in 1946.
Carter became interested in submarines and eventually qualified for command of ships. In 1952, he started working in the Navy’s growing nuclear submarine program. He was based out of Schenectady but spent time at the Naval Reactors Branch of the Atomic Energy Commission in Washington as well. When a partial meltdown took place at Canada’s Chalk River Laboratories in 1952, U.S. experts went to help, including Carter. He was exposed to radiation while disassembling the reactor. He was in protective gear and didn’t suffer any negative health consequences, but this permanently affected his position on nuclear weapons and nuclear power.
In 1953, Carter started training to serve on a nuclear submarine. But he was tiring of military life and when his father died, Carter and Rosalynn chose to return home to Plains. At first, he and his family, now consisting of three small boys, lived in public housing in Plains, making him the only president to have lived in public housing. But he soon took over the family farm and his father’s peanut operations. Being a scientific man, he made this a going concern.
One thing that always drives me nuts about how conservatives talk about Carter is their dismissal of him as a “dumb peanut farmer.” Sure, and a Naval officer who worked on nuclear submarines.
One of the critical questions about Carter is how he more or less overcame the racism so central to his growing up. That he followed someone as utterly awful as Lester Maddox as governor makes his rise and career even more interesting. Carter’s early political career is one of racial moderation but not a lot of racial courage either. He mostly kept quiet about his belief that segregation should be abolished. That doesn’t mean his positions weren’t known. When the White Citizens’ Council approached him to join them, he refused and then they boycotted his peanut warehouse. There was room for moderation among whites on segregation and that’s where Carter firmly remained. Not every white was a WCC or KKK member, even if very very few took any real risks on promoting desegregation.
In 1961, Carter became the chairman of the Sumter County School Board and here he did vocally approve of integration. When he ran for state senate the following year, the Democratic Party machine wanted him to lose, so they fixed the election with the aid of the county sheriff. Carter challenged the result, the fraud was uncovered, and Carter won the next election.
But Americus was a place of racial violence. That was hitting a peak in the early 1960s and Carter basically did not speak out about it out of fear of alienating the segregationists he needed for his political career. This was the politics of the racial moderate. For an ambitious politician from southwestern Georgia, this was not unexpected. He certainly could have been a whole lot worse.
Carter was a very hard worker and very ambitious. These traits served him well. He rose rapidly within the Georgia Democratic Party, taking speed-reading courses so he could digest more material. He got a position on the state Democratic Executive Committee and became chairman of the West Georgia Central Planning and Development Commission, overseeing the distribution of state and federal grants. This made him regionally powerful and also put him in conflict with established interests who disliked the anti-corruption politics of the newcomer.
All of this was intended to set him up to run for governor in 1966. Carter was something of a late entry, but his political enemy, the Republican Bo Callaway, whom he had clashed with on the planning commission, ran on a pro-segregation platform. Democrats feared losing the state for the first time since Reconstruction and Carter decided to take him on. He ran as a moderate and came in third in a three-way primary, behind the loathsome violent racist Lester Maddox and the old New Deal liberal Ellis Arnall. Maddox won the run-off and then the general.
Carter was devastated—coming in third was not his plan and seeing Maddox take power was definitely not his plan. But he ran again in 1970, this time a more experienced party leader and a savvier politician. He managed, somehow, to court both the black vote and the segregationist vote. He met with Andrew Young and Martin Luther King, Sr. while also inviting George Wallace to come make a speech in Georgia. Overall, this was a more conservative campaign than four years earlier. He attacked his liberal primary opponent for being a northern-style progressive and, toward the end of his campaign, actually disseminated racist ads showing his opponent with black basketball players. Such was the reality of Georgia politics in 1970.
The moment he took office, Carter completely turned his back on the segregationists. They were angry. In his inaugural address, he said the time for racial discrimination was over. That was fine, but he wasn’t a particularly effective governor, for reasons that repeated themselves in his presidency. He didn’t like working with the legislature, in no small part because he hated the glad-handing that required, which he associated with corruption. He also felt government was too big, and while there may have been good reasons behind his goal of streamlining government, reducing departments, and placing greater power in the governor’s office, the same impulse would serve some less-than-progressive ends in the White House.
On race, Carter appointed a lot of African-Americans to office, the first governor of Georgia to do so since Reconstruction. On the other hand, he opposed busing as a strategy to integrate schools, co-sponsoring an anti-busing resolution with George Wallace at the National Governors Association annual meeting in 1971, and he supported the death penalty, which of course was disproportionately applied to black people.
Carter would also embrace really bad positions for political reasons. For example, when William Calley, the officer who led the My Lai Massacre that killed over 500 innocent Vietnamese, was convicted of his crimes, Carter led a statewide initiative that created something called American Fighting Man’s Day and had Georgians drive with their headlights on during the day for a week as a symbol of their support for the war criminal.
Carter seems like an unusual presidential candidate, or more accurately, an unusual person to actually win the nomination. He was always very ambitious. He tried to align with conservative forces at the national level so he could balance the McGovern ticket and become the VP candidate in 1972. That obviously did not work. He did the work to raise his profile, but it was still low. In 1973, Carter appeared on What’s My Line, where the panelists had to guess his occupation. It took a long time before Gene Shalit (who still lives!) figured it out. Carter was just a first-term governor of a medium-sized state with no national profile. That was not going to stop him.
Carter announced his presidential candidacy in December 1974. No one cared. By January 1976, he had just 4 percent support among Democrats in polling. But, with overall disgust at Washington after Watergate, Carter managed to rise fast in early 1976. He won in both Iowa and New Hampshire. He was the moral outsider moderate, not Nixon, not Wallace, and not McGovern. It worked. The dark horse won the nomination, naming the liberal Walter Mondale as his running mate. He had a big lead early in the general, but Ford nearly came back to win; in fact, Ford won more states. But Carter became president, the most unlikely person to win a presidential election since Warren Harding.
Unfortunately, Carter really sucked at being president.
The problem with Carter’s presidency is that he was bad at the job. Really, he was bad at it in many ways. The ultimate micromanager, he could get distracted with trivia. His distrust of established politicians meant that he found himself surrounded by economic advisors who told him repeatedly to triangulate between the parties, alienating everyone. He had opportunities to change the nation’s trajectory by passing groundbreaking legislation with large congressional majorities, especially in his first two years, but he just wouldn’t do it. His moralistic take on the world had some value in a post-Nixon era, but also blinded him to the complexity of many problems and the kind of deals one had to make in order to succeed.
Simply put, Carter is as close to a libertarian as we have ever had as president. That’s a tough one for us to swallow, perhaps. We may want to see him as a great liberal. But he wasn’t a liberal at all, especially not on economic issues. He truly believed that the nation needed to move on from the New Deal state. He distrusted government programs to help the poor. Although Congress was filled with liberals, he surrounded himself with the new neoliberals, who told him repeatedly that inflation mattered much more than either building an effective political coalition or taking brave stances to use his power to create a more equal world. He loved deregulation and repealed many of the consumer protections that had become law over the previous few decades. He fought for lower taxes over economic stimulus, consistently undermining his own Democratic Congress. Moreover, he was a true believer in all of this stuff. It wasn’t political expediency, which you might understand. No, he had a vision for the economy that was antithetical to contemporary liberalism. Just because he was a good man personally and a Democrat and a great ex-president does not make any of this untrue.
Now, to be fair, these were tough times. The corporate lobby was now well-organized and seeking to roll back the labor and environmental and consumer regulations Americans had passed in the previous few decades and especially the last ten years. The real impact of that was in the future, but looking back, it was clear where this was headed. The economy was really tough. The nation had not dealt well with the oil crisis and the Vietnam War.
Inflation was a very real problem. Already an issue when he took office, it jumped from 5.8% in 1976 to 13.2% in 1980 in the wake of the 1979 oil shock. That would have caused massive problems for any president. Capital mobility and deindustrialization were beginning to sweep the nation and no one had any answers for communities such as Youngstown. That city’s famed Black Monday, when the first of the big steel plants shut down, took place in the first year of the Carter administration. What to do about Youngstown and other places would be a big theme of this administration. But Carter’s own reluctance to take aggressive action on the economy and deindustrialization continues to reverberate today. Basically, Carter didn’t care much about cities like this and effectively offered them nothing to ease their burden.
But even outside of the big issues of the time, Carter wasn’t very good at being president. He always had more than a bit of the anti-politics politics that drives ideas like jungle primaries today, and his lack of attention to partisanship meant that he blew many easy chances to pass positive legislation. His micromanaging was legendary and took him away from the things he needed to be focusing on. He started his administration by taking on some western water projects that he thought were pork, which was probably fine in theory, but this is what he chose to spend his first political capital on, and all it did was infuriate leading members of Congress from both parties who benefitted from them. As they say in the West, whiskey is for drinking and water is for fighting. If Carter had bothered talking to western politicians about this, he would have known better, but that was not his style. Instead, he just made enemies of people who led powerful congressional committees and who started looking at the westerner Ronald Reagan shortly after.
None of this is to say Carter didn’t do some good things. His second day in office, Carter pardoned all Vietnam draft evaders, a clearly morally correct policy and one over which he took a political hit. Carter was also the greatest environmental president we have ever had. It’s worth noting what a missed opportunity the nation had to get serious about its environmental problems and move forward into a clean energy future that could have significantly mitigated the impact of climate change.
Nothing says more about these missed opportunities than Carter having solar panels placed on the roof of the White House and Reagan then having them removed. Treating the energy crisis like a war was a great policy, but it was also not a message Americans wanted to hear. Americans want their president to kick some ass. Telling them to turn down their heat and put on a sweater is more or less the opposite of that. Creating the Department of Education was probably a good thing, even though the position is one of the weakest in the Cabinet, which, at a time when Linda McMahon is about to lead it, is probably a good thing, even if local control over education ends up creating a lot of problems.
Much of Carter’s problem was that he was pretty clear-minded about America’s limitations while governing a nation angry at having those limitations exposed. Americans wanted to drive huge gas-guzzling vehicles, not live with the gas rationing plans Carter presented to Congress in 1979. His famous “malaise” speech from later that same year, nominally about our energy issues but really about the overall position of the United States at that time, was widely attacked. One might argue that Carter simply lacked the political skills to be an effective president. Terrible at messaging these issues and too honest for a cynical media, he struggled to connect with Americans. Perhaps a different politician could have taken on these issues more effectively, but we will never know. In any case, he had so alienated Congress by that point that in May 1979, the House voted against giving Carter the authority to create a gas rationing plan; Carter responded by calling the vote “embarrassing,” which did not help him mend those needed relations.
On other environmental and workplace safety issues, Carter was really great. His choice to lead the Occupational Safety and Health Administration, Eula Bingham, was outstanding, and this was the only administration in which OSHA really moved toward the activist force it could be. His 1978 declaration of a federal emergency at Love Canal and the Superfund program that followed were brilliant.
On public lands, Carter was outstanding. His most important action was signing the Alaska National Interest Lands Conservation Act, which protected 157 million acres of land, including 43 million acres in national parks and over 12 million additional acres designated as wilderness, and created two national monuments. That followed Carter’s use of the Antiquities Act to protect 56 million acres as national monuments in 1978, which got him burned in effigy in Fairbanks.
On some foreign policy issues, Carter deserves a good bit of credit. The negotiations that led to the SALT II treaty with the Soviets, fixing nuclear missile counts and limiting new development, were a very positive step toward peace in the Cold War. The Ford administration had started this process, but it was Carter who nailed it down with Brezhnev. The agreement was signed in 1979 and it seemed that Soviet-American relations were improving. But with the Soviet invasion of Afghanistan shortly after, relations suffered badly and the Senate refused to ratify SALT II.
On the other hand, Carter, bringing his moralism into foreign policy, decided to boycott the 1980 Olympics. This was a shame. The only people who suffered in the boycott were the athletes who had trained for the Olympics their whole lives. The Soviets’ retaliatory boycott in 1984 meant that you had back-to-back games tainted with political posturing. Carter redoubled efforts to influence Pakistan after the Soviet invasion of Afghanistan, which might make sense from a geopolitical perspective, but the Muhammad Zia-ul-Haq regime was pretty awful. No good choices there. And a lot of that funding went to Islamist resistance groups, which did not exactly end well for the U.S. or for the citizens of Pakistan and Afghanistan. Still, Carter always defended this decision.
Fundamentally, bringing human rights and moralism into foreign policy was simply hard to do, especially in the aftermath of Kissinger. It was a welcome change, but it was also applied with massive inconsistency. That’s what we see in Asia, Africa, and Latin America. Carter wanted to use U.S. influence for a free Rhodesia and to end apartheid in South Africa, but he faced too many big obstacles, including growing fascism among white South Africans, the election of Margaret Thatcher (and Britain was more influential in South Africa anyway), and the fact that conservatives in Congress such as Jesse Helms liked apartheid. Carter was mostly good, but then he would say nothing against the horrors of Ferdinand Marcos in the Philippines because the country was too important an ally. These inconsistencies were noted at the time.
Carter faced the Latin American dictatorships and responded with at least some level of disapproval, telling the Argentine junta to quit throwing people out of airplanes. He didn’t respond well to the rise of the Sandinistas in Nicaragua, but at least he didn’t instantly move toward murderous militarism the way Reagan would. He also worked out the treaty with Panama to return the Panama Canal to that nation in 1999. That defused an increasingly problematic site of protest by anti-colonial forces that reminded the world of American imperialism. It also led to a massive right-wing backlash, becoming the culture war issue of the 1980 election and leading to the defeat of several long-time politicians who supported it, such as Frank Church in Idaho. This is one of those issues that is almost impossible to wrap your head around today: why did returning the Panama Canal to Panama cause such a widespread reaction? But that’s OK; in 50 years, people will wonder the same thing about Critical Race Theory and the modern Republican Party. Amazingly, we are now talking about taking over the Panama Canal again, because our next president, besides being a zillion years old, is partly a response to Carterism anyway.
Of course, Carter’s signature achievement was his work toward peace in the Middle East. The Camp David Accords did not in fact bring peace to the region, but Israel and Egypt have more or less gotten along ever since, and that reduced overall tensions tremendously. Unfortunately, the assassination of Anwar El-Sadat and the rise of Hosni Mubarak placed sharp limitations on Egyptian governance, and the recent history of Israel is of a nation turning far to the right. But getting those two nations to sit down and work out an agreement was an actual achievement greater than nearly any other president has managed in foreign policy.
It’s hard to say much positive at all about Carter’s response to the Iran hostage crisis. To be fair, there weren’t a lot of great cards to play. But the rescue operation was an unmitigated disaster and Carter deserved the blame he received for it. Early on, I don’t think you can criticize Carter too harshly. He announced sanctions and proclaimed he would not order a military action that would “cause bloodshed.” But as his 1980 reelection campaign struggled against the rise of Ronald Reagan, Carter ordered a military mission into Iran to rescue the hostages. Operation Eagle Claw was one of the worst disasters in American foreign policy history. Secretary of State Cyrus Vance explicitly told Carter that this was a terrible idea, but Zbigniew Brzezinski was a more influential advisor, and he wanted a military solution. The operation itself was a complete disaster, with botched preparations, helicopters disabled by desert dust, and finally a helicopter colliding with a refueling tanker, leaving aircraft burning in the Iranian desert and eight servicemen dead. Vance resigned in disgust. This both showed American incompetence and gave the Iranian regime endless propaganda with their own people and globally. Coming on the heels of Vietnam and the oil crisis, this was a huge blow to American prestige and self-confidence.
Of course, the media also savaged Carter in horrendously unfair ways. He was seen by the Beltway elite as a redneck outsider who didn’t share their values or interests. His infamous Playboy interview, in which he said, “I’ve looked on a lot of women with lust. I’ve committed adultery in my heart many times,” led to a lot of chortling among the media. His evangelicalism was funny to them. Nothing sums up how poorly Carter fared with the Beltway hacks who make up our mainstream media, then and now, better than the infamous rabbit incident. Carter was a man with a farm in rural Georgia. He was used to dealing with animals and knew how to handle a rabbit. But these Beltway hacks found it hilarious. What a buffoon, that Carter! He had a difficult brother who was a media joke. Good thing the media handles problematic family members of Democratic presidents reasonably today! He also had his young daughter Amy in the White House with him, and she received way too much media attention for a girl that age. I don’t have much positive to say about how the media has evolved over time, but largely leaving the underage children of presidents alone is a good thing, even if you are unfortunate enough to have Donald Trump as your father. Presidential children of age who are trying to implement fascism, well, that’s another thing entirely.
But it was really on the economy that Carter’s presidency floundered. Convinced that inflation was caused by monopolies, he believed strongly in deregulation. Carter’s emphasis on deregulation wasn’t entirely out of order; after all, he did help create the modern microbrewery movement through it. And while the Airline Deregulation Act did usher in an era of cheap airfare, it also laid the groundwork for the almost comically terrible experience of flying today. But overall, his emphasis on deregulation as opposed to better regulation played no small role in the neoliberal era that has snowballed into the all-out war on the regulatory state today. He had no vision for fighting inflation except for softer versions of the pro-corporate policies that Reagan would later pursue, without any of the messaging that Reagan would be so effective at. And his austerity programs and desire to cut social programs led to a lot of disturbance among Democratic leaders. As Tip O’Neill said in 1979, “Can you reelect the president on austerity?” The answer was no.
Perhaps nowhere did Carter show his massive limitations of imagination and governance more than on the Humphrey-Hawkins Act. That bill, which would have created true full employment, initially including the right to sue the government if you could not find a job, was an attempt to rebuild the liberal coalition by appealing to the best of the New Deal job programs and to the black community, which suffered so badly from underemployment. The bill’s sponsors thought Carter was on board with them during the 1976 campaign, but they were wrong.
Carter surrounded himself with neoliberal economic advisors who prioritized inflation over every other goal and governed significantly to the right of his quite liberal Congress, infuriating the left. Carter sent Charles Schultze, his chairman of the Council of Economic Advisors, to torpedo Humphrey-Hawkins. Schultze effectively shut down any provision that would actually create full employment or commit the government to putting money into job creation. The final bill, a mere shell of the original, committed the government more to fighting inflation than to helping the poor.
What happened with Humphrey-Hawkins repeated itself over and over during the Carter administration, infuriating unions and the left. As Jefferson Cowie tells it in his great book Stayin’ Alive: The 1970s and the Last Days of the Working Class, near the end of Carter’s administration, a journalist asked International Association of Machinists president Wimpy Winpisinger how Carter could revive his reputation in the labor movement. Wimpy answered, “Die.”
Of course, Winpisinger didn’t want Carter to die, but that’s how far Carter’s reputation had fallen on the labor left by 1980. Carter’s response to the decline of the steel industry was almost nonexistent. He was more focused on propping up foreign steel suppliers to fight inflation than on the jobs being lost in the U.S. The Japanese and European steel companies dumped their extra supply on the U.S. market, undermining the ability of American firms to compete in their home markets. This hurt his own base voters in critical northern states, and, taken aback by the closures, he struggled to even articulate a coherent response. Nothing that happened shook his belief that growing imports would only help the U.S., in both domestic and foreign policy. Carter shrugging his shoulders at Youngstown and other sites of deindustrialization infuriated working-class voters.
In short, Carter had the congressional majorities to rebuild the New Deal coalition. Instead, he pandered to conservatives on economics, defied his own congressional caucus, and proceeded to fail entirely at stopping the Republican Party in 1980. I don’t know if reestablishing New Deal politics would have stopped the rise of Reagan and the right, but it couldn’t have ended worse than Carter’s actual policies did. Again and again, Carter sent bills to Congress that no one liked. He alienated his own party, Republicans didn’t support the bills either, and he simply would not build political coalitions to help himself out. You can’t help those who won’t help themselves, and Jimmy Carter would not help himself.
By 1980, Carter was heavily damaged goods. The growing right certainly wasn’t going to vote for him over Reagan. Yet he had strongly alienated his fellow Democrats, both in Congress and the base, which was still pretty strongly union-based at this time. His racial moderation wasn’t going to appeal to southern whites, and he lost the chance to really lock in high participation from African-American communities with economic policies that did not take the fight against poverty seriously. It’s fair to say that Ted Kennedy’s primary run against Carter was stupid and just hurt the president, but then Carter had pretty much asked for a liberal challenger. In fact, I don’t really have a problem with Kennedy deciding to challenge Carter; it’s that Kennedy himself ran an awful campaign, so he damaged Carter without actually beating him, the worst of both worlds. Who knows if Kennedy would have defeated Reagan, but this was still the pre-serious part of his career, so I am skeptical.
In the general, Carter actually started out ahead of Reagan in polling, but by the fall, it was clear that Reagan was going to win. He did, going away. Carter won only his home state of Georgia, plus Hawaii, Minnesota, West Virginia, Maryland, Rhode Island, and DC. The electoral college total was an awful 489-49. Reagan won 50.7% of the popular vote to 41% for Carter and 6.6% for John Anderson. What a disaster.
After Carter lost, he had every right to be bitter and withdraw into private life. In 1981, there wasn’t too much of a precedent for what ex-presidents did, by which I mean that they did all sorts of things. Some were old and died soon after. LBJ went back to Texas, relaxed a bit, and started teaching at the University of Texas occasionally. Eisenhower golfed. Hoover stewed in endless bitterness at FDR. Nixon sought to rehab his reputation.
Carter chose none of these paths. Rather, he went into a life of public service unmatched by any ex-president since John Quincy Adams. This started with the founding of the Carter Center in 1982, which I think was the first serious foundation founded by an ex-president. Carter made the most of it, fighting for democracy and human health worldwide.
Carter is most famous for his work with Habitat for Humanity, which seems like such an institution now that one forgets how central Carter was to its growth. He and Rosalynn started working with Habitat on a 1984 project in Americus, near Plains. Soon after, he led his own Habitat group to New York and a long collaboration had begun. Now, while we can say that this sort of voluntarism has a downside, because the government should be taking care of housing for the poor, of course the government is very much not doing that. Carter, believing in living his faith, helped spur a new path of voluntarism and this was a tremendously positive thing.
Carter’s work on tropical diseases is even more important. It’s hard to state just how horrific diseases such as Guinea worm and river blindness are. In 1986, the Carter Center decided to take on Guinea worm. That year, 3.5 million people suffered from the disease, spread across 21 countries. Today, it is almost completely eliminated. This is how you do a post-presidency. Carter long said he wanted to outlive Guinea worm. He may not quite have done so, and it could come back without continued vigilance, but what an amazing accomplishment. Moreover, through the whole thing, although Carter had no small ego himself, he handled himself with such grace and class and modesty. Bill Clinton, who always had a complicated relationship with Carter, could have learned more than a few things from the man about personal behavior, both during and after his presidency.
Carter also continued to take brave and bold stances on the issues he most cared about. He had no reason to take controversial positions. But he felt it was the right thing to do. That was especially true of his advocacy for peace in the Middle East, which was always incredibly noble, if out of fashion with an Israel no longer interested in a two-state solution or peace with the Palestinians. He spent a decent amount of time in North Korea, talking to that nation about giving up its nuclear program, and in 1994 he persuaded Kim Il-Sung to agree to a freeze, although of course that didn’t last. His election monitoring in politically troubled countries, particularly in Africa, was crucial work as well. In 2002, he received the Nobel Peace Prize for his efforts. Unlike Barack Obama’s Nobel Prize for not being George W. Bush, Carter truly earned his.
Up to the end, Carter spoke out. He criticized Trump for ending the Iran nuclear deal, a true foreign policy disaster, showed up around the nation for various events, often revolving around the Carter Center, and bemoaned the state of the government. He was a moral voice. He wasn’t a good president, not by a long shot, and recent efforts to revive his reputation aren’t very convincing. He was outstanding in some areas, but the number of unforced errors severely undermined him. But he was absolutely a good man. We will all miss him. But definitely not for his presidency, which was bad.
Well, this is something special, a holiday treat for the end of 2024: a group of archivists (including Chris Person) has uploaded an HBO magic special by Ricky Jay that has been largely unavailable since it aired in 1996.
This is an RF rip of Ricky Jay and His 52 Assistants, to date the greatest card magic special ever produced, directed by David Mamet of all people. This special was produced by HBO and to date has never had a home release, although poor home recordings of this special exist online.
Before getting into preservation generally, it’s worth considering how we got here. Why is so much media lost or badly preserved? A recurring reason is that the people in charge are sometimes, but not always, asleep at the wheel. Media is forgotten or stored improperly, and humidity and heat have destroyed more of our history than we will ever know. Sometimes companies handle the material sloppily (I’ve blogged about the use of AI before, but there are countless examples in audio too).
Having shared all that, I feel like the quality of this YouTube video of the special is not perceptibly worse than the one uploaded to archive.org? What am I missing?
The playwright David Mamet and the theatre director Gregory Mosher affirm that some years ago, late one night in the bar of the Ritz-Carlton Hotel in Chicago, this happened:
Ricky Jay, who is perhaps the most gifted sleight-of-hand artist alive, was performing magic with a deck of cards. Also present was a friend of Mamet and Mosher’s named Christ Nogulich, the director of food and beverage at the hotel. After twenty minutes of disbelief-suspending manipulations, Jay spread the deck face up on the bar counter and asked Nogulich to concentrate on a specific card but not to reveal it. Jay then assembled the deck face down, shuffled, cut it into two piles, and asked Nogulich to point to one of the piles and name his card.
“Three of clubs,” Nogulich said, and he was then instructed to turn over the top card.
He turned over the three of clubs.
Mosher, in what could be interpreted as a passive-aggressive act, quietly announced, “Ricky, you know, I also concentrated on a card.”
After an interval of silence, Jay said, “That’s interesting, Gregory, but I only do this for one person at a time.”
Mosher persisted: “Well, Ricky, I really was thinking of a card.”
Jay paused, frowned, stared at Mosher, and said, “This is a distinct change of procedure.” A longer pause. “All right — what was the card?”
“Two of spades.”
Jay nodded, and gestured toward the other pile, and Mosher turned over its top card.