As March marches along (pun intended) we must now turn to the pressing issue of this time of year: the dog poop that’s been lying latent on/in the snow since January. Oh, boy…
The construction of HMS Dreadnought in 1905 was said to have triggered the naval race that drove WWI. While the nature of the historical record makes such claims unknowable–and a matter of opinion–Dreadnought did mark the beginning of the end of surface warship development. First Sea Lord John A. (Jackie) Fisher’s “all big gun” innovation drove warships to drop their multiplicity of ordnance in favor of a single primary battery and a host of secondaries. It also made them horribly more expensive.
Warships and the facilities to keep them have always been and will always be an expensive method of national defense, but in many cases a necessity. The ships themselves are only the most visible symbols. The dockyards, storehouses, training centers, munitions factories and everything else needed to maintain the ships cost orders of magnitude more than the ships.
But Dreadnought served in a particularly expensive, volatile and innovative period. Fisher felt that a warship’s first duty was to sink other warships, and that scrimping on main gun armament in favor of smaller guns was a waste of space. Dreadnought carried ten 12-inch main guns in five turrets compared to the Lord Nelson class’s four 12-inch guns. To serve these guns, she was one of the first vessels in the Royal Navy to be built with electrically-operated centralized fire control. This large number of big guns was incentive enough to drive all other major combatants to follow the big-gun philosophy. While building her wasn’t particularly expensive for the time, designing and building entire navies in response to that one vessel was–and that’s what happened.
For all the innovation she drove and all the sensation she caused at the time, Dreadnought’s combat record was quite brief–in fact, she never fired a shot at an enemy vessel. Dreadnought was, however, the only battleship to purposely sink an enemy submarine. On 18 March 1915, German submarine SM U-29 broke the surface immediately ahead of her, and Dreadnought rammed the submarine, cutting it in two. She spent much of WWI being refitted and repaired, was paid off in 1920 and later scrapped. Very little of the ship that drove a hundred others remains.
Saint Patrick’s Day 1984/2019
Yesterday was St. Patrick’s Day, commemorated with a parade first in Montreal in 1824 and observed in Canada as far back as 1759. The saint himself was said to have been born in Britain in the 4th century and to have returned to Ireland in the 5th to spread Christianity. He didn’t drive the snakes out: there were never any there.
But St Patrick’s day is noisily celebrated nearly everywhere, from Dublin to Yokohama to the International Space Station, primarily as a pop culture celebration and a reason to get blasted. Having an Irish heritage (my first ancestor in the New World was transported from Ireland to Jamestown in 1611) I can recall doing this more than once after I turned 18, and I can recall more than one St Patrick’s Day Blizzard growing up in Michigan.
On 17 March 1984, however, this scrivener and his bride Evelyne tied the knot in Waukesha, Wisconsin (see above). It was a sort of a compromise date. My step-brother was dying of cancer in Detroit at the time, and my step-father and my mother were shuttling back and forth between Michigan and where they lived in Florida, so I wanted to catch him on an up-cycle, and the date that became convenient was 17 March, a Saturday. It didn’t snow much that day, but it has snowed often enough on St Patrick’s Day since to make each anniversary memorable. And we’ve spent all of them together.
But six years ago today, on 18 March 2013, I had my C-3 through C-7 vertebrae fused together. Didn’t snow that day, either, but it snowed a week later. I was in a brace and couldn’t do anything about it…but there it was.
So yesterday was our 35th wedding anniversary. Happy day, honey. I know you won’t read this, but I’d do it again, over and over. Love you!
In the late winter of 1917, a handful of Indo-Chinese laborers crossing the US on their way to France stopped in Haskell County, Kansas for at least three days. They had limited contact with any local Americans, and what contacts there were, were casual at most. On the morning of 11 March 1918, Albert Martin Gitchell reported to sick call at Camp Funston, Kansas (on the site of the modern Fort Riley). He complained of a high fever, aches and pains, and a cough. Usually, this would have meant isolation in a sick ward (which was done), but Gitchell was a cook who had been serving food as late as the night before. By noon there were 107 influenza patients; in a week, over 500. By April, over 1,000. Even this would have been unremarkable if 46 hadn’t died–horribly, blue in the face, coughing lung matter out in their final moments.
What no one appreciated just then was that this H1N1 strain of influenza (so-called for the proteins in the outer shell of the virus) might have started as early as 1916 in Britain or France–to this day it is unclear. There were no centralized reporting mechanisms then, no CDC or WHO that anyone could recognize as such. Modern researchers believe that this strain of influenza may have been a close genetic match to the 1898 influenza, a milder form that swept the globe starting in China (as the flu so often does) from October 1898 to March of 1899–flu season. It seems likely that the Vietnamese laborers carried the virulent Asian bug into Kansas, where it crossed with another strain, though the truth is unknowable.
What was remarkable about it wasn’t the “knock-me-down fever” itself, as the flu was called, but that so many (proportionately) died, and not the elderly, infirm or very young who were usually flu fatalities. These were young people, healthy and in the prime of life. Four deaths would have been odd, but 46 such horrible deaths were downright alarming. But the war came first, and the survivors–many still weak from the experience–shipped out in April and May for France, on slow-moving trains that stopped a dozen times before they reached whatever port they departed from–and spread the flu as they went.
At the time, medicine was in a state of transition. The only widely accepted vaccine was for smallpox; there were no antibiotics; there were still physicians whose medical training took about four months and did not involve looking at a cadaver. This bug spread from town to town, state to state, country to country. By June it had spread to most of the ports of embarkation and exploded worldwide. The Wilson administration was aware of the pandemic but forbade widespread news coverage of it because it would have been bad for morale. The British and French, Italians and everyone else had similar reasons for not covering it as the bodies stacked up in the morgues and ships arrived in port with bloody flux on the decks and dead by the hundreds. For this reason, the only major European power that covered this plague in its mass media–the newspapers–was Spain, and that’s how it came to be known as the Spanish Flu.
This flu hit the sufferers suddenly and often violently. Caregivers came to know which sufferers were going to survive and which would not within the first few hours after the symptoms presented. Extreme sufferers (about 20%) turned blue, cracked their ribs coughing, spewed black fluids from their mouth and nose, and died in hours…sometimes minutes. There was no treatment save codeine for the cough, and it hit those between the ages of 20 and 40 the hardest. Post-mortems showed the extreme sufferers drowned in the detritus of their own immune systems, which attacked the invading virus so vigorously that they killed their hosts. In milder forms, the affected (about 40%) simply weakened and died within days. The mildly afflicted–the lucky or strong 40%–suffered from a malaise that often lasted for years, sometimes for the rest of their lives.
By the second week in November 1918–when the War to End All Wars was ending–leading clinicians in the US and Britain, Russia (where the Revolution came to a brief halt) and even Japan were calculating the end of the human race. Most gave humanity perhaps six months to live. Many believed it had to be a new plague…a resurrected, reconstituted Black Death.
The American Army’s fatality rolls for 1917-18 were doubled by influenza. Large cities like Detroit and Chicago, Paris and London monitored traffic and imposed quarantines; rural communities and isolated islands stopped traffic altogether, frequently at gunpoint. A streetcar in Johannesburg took on passengers at a stop and, five blocks later, unloaded all 21 passengers and the conductor–dead. Children deprived of their caregivers starved to death, especially in urban areas. Funeral directors ran out of coffins and embalming fluids, which, combined with shortages of gravediggers, resulted in mass cremations: one in Vienna, Austria was said to have contained over 10,000 dead. Entire North African and Chinese villages were burned. Actuaries in the United States dropped the average life expectancy for 1918 from 55 to 37.
By the end of November, the rate of infection slowed, and by the end of January 1919, it became clear that the crisis had passed. It came again that winter, and once more in the winter of 1920-21, but the virulence seemed decreased, and the number of fatalities far less. Nearly 100 million people worldwide were killed directly or indirectly by the 1918 influenza; perhaps one in three–about 500 million–were affected one way or another: sickened and survived like my mother’s father, overworked and weakened like my father’s father, or watched whole populations wiped out like my father’s uncle. In closing:
There are no lab samples, despite years of searching in graves: thus, there are no specific vaccines against the 1918 influenza.
Since the 1918 bug struck those in the prime of life–those who make vaccines–it is not clear that one could be made available if it should strike again.
The failure rate of the annual flu vaccine is about 30%; in bad years, like 2017-18, it rises as high as 60%. However, even failed or non-specific vaccines decrease the symptoms and the likelihood of retransmission.
Herd immunity is best sustained when 92% or more of any given population has been vaccinated, even with a non-specific vaccine.
The “reaction” to the flu shot shows that it is not only working but that the sufferer has already been exposed and is likely contagious.
About 20% of adults do not get regular flu shots.
Got your flu shot yet? Why not?
Thursday is Pi Day–3.14. It started out in 1988 as a celebration of mathematics by Larry Shaw of the San Francisco Exploratorium. The US Congress passed a non-binding resolution in 2009 recognizing 14 March as Pi Day. Nominally, this clever holiday has been celebrated or observed by throwing pies, holding mathematical symposiums, eating pizza and other more or less benign activities.
However, like many other things, Pi Day has been hijacked by…other interests. In 2005, an Oregon State physics major named Bobby Henderson sent an open letter to the Kansas State Board of Education, which was then struggling with creationism and intelligent design requirements alongside more scientifically accepted versions of Earth’s origins. He suggested that it is as likely that a Flying Spaghetti Monster created everything as it was any other deity. The most significant phrase reads:
I don’t have a problem with religion. What I have a problem with is religion posing as science. If there is a god and he’s intelligent, then I would guess he has a sense of humor.
Not exactly 95 Theses nailed to a door, but in the 21st century, it was enough. Soon, the Church of the Flying Spaghetti Monster–the Pastafarians–was born. A book entitled The Gospel of the Flying Spaghetti Monster was released in 2006. There are websites, and more books, and the odd, odd convention, and somehow piracy and other odd things got tossed in the chaotic mix. Mostly the Pastafarians are poking fun at organized religion, especially when it pretends to circumvent falsifiability.
Oh, and Pi Day is celebrated by some Pastafarian sects as recognition of a related deity. It is observed by reverently eating pizza…at least, according to my late buddy Bill, may Pasta rest his soul.
March…the month that deceives. It’s supposed to be coming up to spring, but here in the Great Lakes we can expect at least one more big snowstorm. We’ll know when it gets here.
4 March was an important day in American history for over a century. The Congress of the Confederation, acting under the Articles of Confederation, decided that the Constitution would take effect on 4 March 1789, when Washington was to be sworn in as President. But the electoral votes couldn’t be counted by then, so his inauguration was put off to 30 April. Thereafter, every routine presidential inauguration was held on 4 March except when it fell on a Sunday, in 1821, 1849, 1877 and 1917. The tradition ended with Amendment XX in 1933, which fixed the inauguration on 20 January.
This was less because of presidents than it was because of Congress. The Constitution states that Congress should meet on the first Monday in December each year, principally so that they would be available to decide who the president would be in the event of an Electoral College tie. 4 March was also the last day of Congressional business. Thus, the “lame duck” Congress was four months long…too long if control of Congress was to change, and those vengeful “other guys” wanted to change things.
Thomas Jefferson’s first inaugural in 1801 was the first held in Washington, DC. James Monroe’s 1817 inauguration was at the Old Brick Capitol in Washington because the British had burned the Capitol in 1814, and restoration was underway. Andrew Jackson’s inauguration in 1829 was marked by drunken revelry but was the first of 35 held on the east front of the Capitol. Abraham Lincoln’s first inaugural was the first performed under armed guard. A blizzard forced William H. Taft’s 1909 inauguration into the Senate Chamber. Warren G. Harding in 1921 was the first to ride in a car to and from the ceremony. Franklin D. Roosevelt’s fourth inauguration in 1945 was entirely without fanfare: the exhausted president had less than four months to live. Jimmy Carter’s 1977 inauguration marked the first “march” from the Capitol to the White House–a hike of about a mile. Since Ronald Reagan in 1981, the ceremonies have been held on the Capitol’s west front, a move designed both to cut costs and to provide more space for spectators. There have also been milestones in communications:
Thomas Jefferson, 1801: the first whose inaugural address was covered by a newspaper extra
James K. Polk, 1845: the first covered by telegraph; first known newspaper illustration of a presidential inauguration
James Buchanan, 1857: the first to be photographed
William McKinley, 1897: the first to be recorded on film
Theodore Roosevelt, 1905: the first time that telephones were installed on the Capitol Grounds for an inauguration
Calvin Coolidge, 1925: the first to be broadcast nationally by radio
Herbert Hoover, 1929: the first recorded by a talking newsreel
Harry S. Truman, 1949: the first to be televised
John F. Kennedy, 1961: the first to be televised in color
Ronald Reagan, 1981: first closed-captioning of television broadcast for the hearing impaired
Bill Clinton, 1997: the first time the ceremony was broadcast live on the Internet
Donald Trump, 2017: the first inauguration broadcast live on Twitter.
Eh, for what it’s worth.
National I Want You to Be Happy Day
Yesterday was National I Want You to Be Happy Day because the folks at–you guessed it–The National Day Calendar say it is. It should be spent doing things that make others happy. A flower here, a silly knock-knock joke there. Buy coffee for the person standing in line behind you. Remind your kids how much you love them. Leave a sticky note for a co-worker telling them to have a spectacular day, a happy day. Draw a happy face in the snow for a stranger to come across later. Give someone a hug. Putting a smile on someone’s face tends to put one on ours, too.
There’s a great deal of frustration…sometimes…in trying to make someone else happy, as we have all experienced. Smiling and telling a joke to someone who just got bad news of any kind can elicit poor reactions. Flowers delivered to allergy sufferers can be deadly. Donuts for the work gang the day of a mass layoff can make the event fall flat. And sometimes someone, like the illustration to the right, just can’t do “happy” as others do. It’s occasions like that, and circumstances like that, when the most positive-thinking folk just move on and hope for the best.
Hope yesterday was at least reasonably happy for everyone.
Dragging our way through February in the Great Lakes…why do we live up here? Snow, ice, cold wind. The only good thing about it is that it does make spring look that much better.
On 25 February 1933, the Navy launched the aircraft carrier USS Ranger, named after a renowned Revolutionary War vessel (as most US pre-WWII carriers were). As the fourth US Navy aircraft carrier, her hull number was CV-4. Smaller than both the two previous 36,000-ton carriers of the Lexington class and the next class, the 20,000-ton Yorktowns, the 14,500-ton Ranger was, like so many warships of the 1930s, a compromise to stay within Washington Naval Treaty limits. She was more notably the US Navy’s first ship designed from the beginning as an aircraft carrier. Everything about Ranger was a learning experience, including her pre-1939 deployments in Latin America, the eastern Pacific, and Alaska: she was the first aircraft carrier to launch and recover aircraft under Arctic conditions. Designed to house and launch as many as 76 planes, Ranger was also the first to get Grumman F4F-3 Wildcats for her fighter squadron, in October 1940.
Because of her size and geared turbines, she lacked the range and speed to operate in the Pacific. Pearl Harbor found Ranger returning to Norfolk from a Neutrality Patrol in the Caribbean. Ironically, the US Navy’s smallest “fleet” carrier (a designation developed during WWII, so she wasn’t called that at the time) was the largest aircraft carrier in the Atlantic Ocean in 1942, spending much of her time as an aircraft ferry, though she still took part in the naval battle of Casablanca on 8 November 1942. Ranger was famous enough for the Germans to claim to have sunk her with torpedoes in April 1943–when she was in drydock. She spent the last half of 1943 as part of the Royal Navy’s Home Fleet, participating in a raid on Norway known as Operation Leader on 4 October.
The Norway raid was Ranger’s last combat operation. A plan to lengthen and modernize her in 1944 was abandoned as not worth the resources. She spent the rest of the war as an aircraft ferry and training carrier, once again venturing into the Pacific as far as Hawaii. In 1945 Ranger trained carrier pilots for night intercepts and transported returning personnel. She was decommissioned in 1946 and scrapped in 1947.
On 25 February 1945, the US Navy’s Task Force 58, consisting of 11 fleet and five light carriers, turned away from their ravaging of Japanese airfields that had begun 16 February in support of the Iwo Jima landings that began on 19 February. Though the numbers are fuzzy, there may have been as many as a thousand US planes involved in the attacks, resulting in a claim of over 400 Japanese aircraft destroyed to fewer than a hundred US losses. These attacks on the Japanese Home Islands were not undertaken with impunity, for the Japanese responded with kamikaze and conventional air attacks. It is interesting to note that Ranger’s predecessor, USS Saratoga (CV-3), then the oldest operational aircraft carrier in the world, was among the fleet carriers attacking Japan, and survived a kamikaze attack on 21 February 1945. It is also interesting to recall that Saratoga was expended as a nuclear target in 1946 and that her hull was still intact as late as 2011.
National Tell A Fairy Tale Day
National Tell-A-Fairy-Tale Day is tomorrow, 26 February, once again because the good folks at the National Day Calendar say it is. Fairy tales, as we all know, are supposed to be fanciful renditions of what were once grim moral folk stories told for the benefit of children, and since the late 19th Century they have always ended with “and they all lived happily ever after.” According to the Australian Fairy Tale Society: “Once upon a time, the people tried to define fairy tale. They are still trying.” Their website suggests the modern fairy tale hearkens back to ancient mythology, and I’ve got nothing to dispute that. Yes, there really is an Australian Fairy Tale Society: click on the link above if you don’t believe me.
But tellers of fairy tales aren’t just in children’s books. They include salesmen of all sorts, especially of used cars, life insurance, and retirement investments. They are also tort lawyers, publicists of all stripes, and marketing and advertising copywriters. Included in this group are, of course, the mass media of both “wings” of American discourse: those at left are merely the most notorious.
The most pernicious, however, are the tellers of fables among elected officials (which would be nearly all of them) and their hangers-on, all of whom scream that they are scrupulously honest right up to election day. The image on top is, of course, of those famous tellers of fairy tales, President Clinton and Wanna-Be-President Clinton. We all remember Willie Jeff’s memorable nationally-televised and emphatic finger-pointing telling of “I did not have sexual relations with that woman, Miss Lewinsky,” and Hilly Rod’s spookily animated “it was the video” fable in 2012, and the serial denials she told afterward…and that Congressional hearing? Epic fable-telling at its best, right up there with Nixon’s “I am not a crook.”
I appreciate that February only has four weeks, but must they be as long as they are? What? They’re the same length as those in June? No, can’t be. You’re making that up. Next week is National Tell Me A Fairy Tale Day, not this week.
Well, given that…Julius Robert Oppenheimer (known either as Robert or Oppie) was born on 22 April 1904 in New York City to an affluent first-generation German Jewish immigrant family in the textile business. Young Robert’s early education was typical of secular Jews in Manhattan, but untypically Robert was introduced to the quasi-Christian/quasi-atheist Ethical Culture movement in the primary grades, which may have informed his life thereafter.
Young Robert, like the physicist Oppie would become, had a wide-ranging mind and interests that ranged from English and French literature to horseback riding and mineralogy. At Harvard he majored in chemistry but gravitated to physics, graduating summa cum laude in just three years. He moved on to Cambridge and to the University of Göttingen, where he earned his PhD in physics at the age of 23 while he developed his most cited work, the Born–Oppenheimer approximation in quantum physics.
After university, Oppie (from a Dutch-derived nickname) was much sought after, dividing his time between the California Institute of Technology (Caltech) and Harvard for a year. Known to work himself to exhaustion while still in school, he was also subject to fits of depression. He had no spare time, dividing himself between hard science and Eastern philosophy, learning Sanskrit on his own so he could read the Bhagavad Gita in the original. He supported communist ideals in the ’30s while aiding refugee scientists from Europe–including many he would later work with on the atomic bomb. At the same time he practically ignored the world around him, and was unaware of the Wall Street crash of 1929 until he was told about it two years later.
Morally, Oppenheimer could be described as a mess. He had an affair with Jean Tatlock, a fellow communist sympathizer who really was a communist, that supposedly ended in 1938. Then he married another communist, divorcee Katherine Puening, who had two children with him while he was still seeing Tatlock from time to time. All of this made for interesting reading and constant surveillance while he worked on the most secret project of WWII.
Oppenheimer joined the Manhattan Project in 1942 as its head scientist, in part because he was a good referee: Oppie was the one world-class physicist in the Western world whom all the others would work for. The military head of the project, Army brat Leslie R. Groves, had been the overseer for the building of the Pentagon and was regarded as one of the best civil engineers in the Army. The unlikely pair–unruly and eccentric Oppenheimer and strait-laced career soldier Groves–got along famously and became lifelong friends, though Groves was always concerned about Oppenheimer’s serial indiscretions.
In July 1945 Oppie and thousands of other workers watched the Trinity device–a plutonium-core nuclear bomb–light up the New Mexico desert. His reaction over the years has been described and reinterpreted, but the one that makes the most sense was relief. After all, the Manhattan Engineer District had cost the United States some $2 billion–roughly $15 for every man, woman, and child in the country. There were scientists who thought that the thing wouldn’t work and others who thought that it would cause a continuing chain reaction and consume the Earth. Whether he thought of the Bhagavad Gita at that instant we can’t know now, however much he said he did later.
His early flirtations with communist organizations, direct or not, caught up with him in the coming Cold War. He lost his security clearance in 1954, and his situation was made worse by his pronouncements that science knew no politics. As head of the Institute for Advanced Study at Princeton from 1947 until 1966, when he began chemotherapy for throat cancer, he encouraged research of all kinds and consistently spoke for harnessing science for the public good. J. Robert Oppenheimer died 18 February 1967, eulogized by many and remembered as a cautionary yet unapologetic voice for science in the public interest.
President’s Day 2019
The appellation “President’s Day” originated in the Uniform Monday Holiday Act of 1968, which took effect in 1971. The Act established Washington’s birthday as the third Monday in February, in the week of 15-21 February; it didn’t officially merge Washington’s and Lincoln’s birthdays, nor did it establish a “President’s Day” by law…pop culture seems to have done that all on its own.
The photo up top is of a unique club: five men who get to be called “Mister President,” at the opening of the Nixon Library in Yorba Linda, California. There’s another to the right: a hurricane relief concert in 2017, when there were five living former presidents. With the incumbent in the White House, this was a period during which there were six living members of the President’s Club, which has happened four times. There have been five periods when there were no living former presidents:
When George Washington died in 1799;
When Andrew Johnson died in 1875;
When Grover Cleveland died in 1908;
When Calvin Coolidge died in 1933;
When Lyndon Johnson died in 1973.
Richard Nixon is the only person to have been both the only living US president (January 1973, after Johnson died, to August 1974, when he resigned) and one of six living presidents (January 1993, after Clinton’s inauguration, to his death in April 1994). Everyone should be known for something odd.
Mid-February: still ice everywhere despite what may have come as a thaw in late January–it often does for a few days in the Great Lakes. Then it drops down to sub-zero again, even after Groundhog Day. Don’t expect the mall snow piles to be gone for another couple of months. Climate change my royal behind…
Today is many things, but the three most important, in order are:
My granddaughter Madeline’s birthday, and I won’t say how old she is other than she’s eligible to vote and can drink legally in any state of the Union;
Foundation Day in Japan, celebrating the traditional beginning of the Empire of Japan in 660 AD;
Constitution Day in Japan from 1889 to 1947.
The Constitution of the Empire of Japan, known colloquially as the Meiji Constitution, was proclaimed on 11 February 1889 by the Meiji Emperor Mutsuhito and became effective on 29 November 1890. Before then Japan, like many other states, had no written constitution but rather a body of law, traditions, and habits formalizing the government and its institutions. That it was issued on Japan’s Foundation Day was no coincidence: it in effect reinforced the fact that, though many social and technological changes were sweeping across Japan in the late 19th century, the emperor and the samurai were still on top of Japan’s heap.
One of the salient features of the Meiji Constitution was that it was, in effect, the Emperor’s to obey or ignore at his discretion. The US Constitution outlines a structure for a government, then goes on to limit the powers of that government; under the Meiji Constitution, whatever the Emperor did or–more ominously–whatever was done in the Emperor’s name was permissible. In practice, the Diet was used to raise taxes and pass civil laws, and the courts were there to legitimize governmental actions. The primary restraint on any of the three emperors who reigned under it was that it provided a veneer of Western constitutionalism that was almost universally recognized. The Meiji Constitution thus made many in the West (those who never read it) believe that Japan was just like them.
The Meiji Constitution set up a government with a politically-chosen, elected Diet (analogous to the House of Representatives), an upper house of nobles (more like Britain’s House of Lords than the US Senate), a chief executive (Prime Minister) and a cabinet to control governmental functions. It then stated that the Prime Minister was to be appointed by the Emperor on the advice of a privy council and the genro of elder statesmen (who were all men), and they didn’t need to be members of the elected Diet–and in practice they rarely were. Thus, the government of Japan was not necessarily responsible to the electorate.
Finally, the Meiji Constitution made the military co-equal with the civil government, in effect making it a fourth branch of the Emperor’s controlling apparatus. If the military didn’t send a minister–a War Minister/general from the Imperial Japanese Army and a Navy Minister/admiral from the Imperial Japanese Navy–to a Prime Minister’s cabinet, or if one service or the other withdrew its minister, the government had to be dissolved and, usually, another Prime Minister chosen. The military did this, over and over again, whenever they didn’t get their way, right up to 1941.
Generals and admirals were Prime Ministers about a third of the time between 1890 and 1945. From the outbreak of WWII in Europe to the surrender, “political parties” in Japan as they were understood in the West ceased to exist, replaced by the Imperial Rule Assistance Association (IRAA), an amalgamation of all Japan’s political parties (except the communists, founded in 1922, who were suppressed) into one. By that time it no longer truly mattered what the civil government had to say: the decision to attack the West in 1941 and all the planning for it took place under Konoe Fumimaro, the last non-military Prime Minister before August 1945.
The Meiji Constitution was technically revised in 1947, but the result was an entirely new document under the same official name, fashioned after the US Constitution by the occupation forces. In it, the Emperor was reduced to a figurehead; the Prime Minister was to be elected by the Diet; and the use of force and the role of the military were severely curtailed. No longer a veneer, Japan’s Constitution may one day end up as a model for a UK Constitution, should they ever write one.
Valentine’s Day 2019
Yeah, OK, a greeting card holiday. Ain’t we tired of saying that all the time, every year? Yeah, it “celebrates” an execution–maybe (there are at least three saints named Valentine who were said to have been executed on 14 February). Only medieval legends have such a saint performing marriage rituals; no contemporary account of their third-century lives says any such thing. Then what? Hey: it’s mid-February, time to feel warm about someone else.
Family loves you whether you think you deserve it or not. You only have to love them back.
Yeah, this is another plug for another book. That’s what this blog is for. This one’s not about anybody getting killed, though it does involve the military. Tideline: A Story of Friendship should be ready mid-year. As my loyal readers (all three of you) know, Tideline is about two people growing up in the ’50s and ’60s in suburban Detroit. They spend their teen and most of their young adult lives apart for reasons beyond their control. Yet, for all those years–nearly half their lives–they never entirely forget each other.
A friend will help you move; a buddy will help you move a body.
He joins the Army for a lot of reasons; she, the Navy for just as many. They run into each other in 1985 in the book’s first prime location, Key West, Florida. They learn of each other’s lives again, yet their services could rip them apart at any time. But these two survivors of the Summer of Love (1967) cannot resist the temptations of the flesh without deciding on limits to their passions: a tideline.
Tideline is a story of enduring friendship, heartache, joy, adventure, and romance, of trust and two people’s grim determination not just to stay together but to convince their services to allow them to stay together and keep their careers. It was a time of chaos: the ’60s, when the streets flooded with protests, and the ’80s, in the turmoil of widespread social, legal and structural changes in the post-Vietnam US armed services, when “social media” was still a written letter.
Hey, not bad, eh? Look for Tideline about mid-year…probably.
February in the Great Lakes: still in the depths of winter, but there are rimes of ice around everything by now. Feet don’t fail me now, we gotta make it until spring.
On 4 February 1818*, Joshua Abraham Norton was born somewhere in England (Deptford, near London, is the best candidate). At the age of two, he and his family moved to South Africa as members of the 1820 Settlers, sanctioned by the British government. Not much more is known about him until he arrived in San Francisco on 23 November 1849 with some capital in his pocket–though how much is unclear and at this point unknowable. What is known is that he was a savvy investor who did well in real estate and commodities, and by 1852 was one of the wealthiest people in San Francisco, which for that time and place is saying a great deal. But he tried to corner the market in rice and lost his shirt doing it, filing for bankruptcy; by 1859 he was living in a boarding house.
On 17 September 1859, Norton declared himself Norton I, Emperor of These United States and Protector of Mexico.
Well, good for him: everyone needs an emperor now and then. Since he had been a prominent businessman just months before, the San Francisco newspapers were more than happy to publish his announcement, and like many others regarded him as a harmless crank. It didn’t take long for His Majesty to start issuing proclamations removing and appointing the governor of Virginia (17 September 1859), dissolving the United States (16 July 1860), forbidding Congress to meet in Washington, D.C. (1 October 1860), and abolishing the Democratic and Republican parties (12 August 1869). All the while, newspaper editors attributed scores of other declarations and decrees to him, few of which were actually his; many were amusing, others quasi-serious.
While it was clear to nearly everyone that the guy was unhinged, it seems that he was indulged well beyond what would be tolerated a century later. Police and militiamen saluted him on the streets; society swells doffed their hats and curtsied; restaurants fed him at no charge; workmen would stop their work as he inspected; his boardinghouse used him as free advertising. Even the 1870 census listed his occupation as “emperor.”
But all good things must come to an end, and on 8 January 1880, Emperor Norton I dropped dead at California Street and Grant Avenue, on his way to a lecture. He was adequately eulogized in the newspapers and buried at the Masonic Cemetery on 10 January. The procession was two miles long, attended by some 10,000 mourners. Since then his body has been moved to Woodlawn Cemetery, where it rests today, maintained by the city of San Francisco. The City by the Bay has never hesitated to suck whatever whimsy or publicity out of Norton that it could. Persons playing the role give tours of the city, dressed in whatever finery they can imagine. Businessman, crank, emperor and tour guide: quite a career.
*The year of Norton’s birth is in dispute: as early as 1814 or as late as 1819.
Create A Vacuum Day
Today is National Create a Vacuum Day because the good folks at the National Day Calendar say so. As we all know, vacuums (the atmospheric state) are an absence of, well, principally gas. So why is it my vacuum cleaner is always full of something–hair, dirt, paper, dust or I just don’t know what? That’s because what we call a vacuum is an absence of air. A true or perfect vacuum which is devoid of all matter is only theoretical and can’t actually be created, exist or be detected…except by the US Congress, which makes one practically every time they meet.
Now, according to Doc Elliot’s Mixology, whose graphic I borrowed above, the reason your drink shaker creates a vacuum when you shake it is because the contents of that shaker cool and contract, creating a partial vacuum that holds the lid on. Have to take their word for it because I don’t do that: hard liquor just doesn’t appeal to me.
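The cooling-and-contracting effect is just the ideal gas law at work: seal room-temperature air in a rigid container, chill it, and the pressure inside falls while the atmosphere keeps pushing on the lid from outside. A rough back-of-the-envelope sketch (the temperatures here are illustrative assumptions, not measurements from any actual shaker):

```python
# Partial vacuum in a sealed, rigid drink shaker, estimated with the
# ideal gas law at constant volume: P2 = P1 * (T2 / T1), in kelvins.

def shaker_pressure_drop(p1_kpa=101.325, t1_c=22.0, t2_c=2.0):
    """Return (final pressure in kPa, pressure drop in kPa)."""
    t1_k = t1_c + 273.15  # room-temperature air sealed inside
    t2_k = t2_c + 273.15  # after the ice chills the contents
    p2_kpa = p1_kpa * (t2_k / t1_k)
    return p2_kpa, p1_kpa - p2_kpa

p2, drop = shaker_pressure_drop()
print(f"inside: {p2:.1f} kPa, drop: {drop:.1f} kPa")
```

With these assumed numbers, a roughly 20 °C chill knocks about 7 kPa (about 1 psi) off the pressure inside–not much of a vacuum, but plenty to hold a lid on.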
This look is something new: a new editor for WordPress that seems easier to use. Every paragraph, graphic, quote or what-have-you here is a separate, reusable block. It makes moving the elements around easier, but I’m nowhere near bright enough to take advantage of all this stuff. So it might look different, but it’s still my same old rambling.