When England decriminalized heresy

Before we can talk about when heresy stopped being a crime in England, we have to talk about when heresy became a crime. And before we can do that, we have to talk about what heresy was. Or is, if you like. And before we can do that in any thorough way, we need to gather up all the time the world has available and use it to study the subject.

Which we’re not going to, predictably enough. We’ll take the shortcut. 

 

Heresy

For heresy to exist, you need orthodoxy: a set of fixed beliefs–preferably religious and powerful–to not believe in. Or since no one will know what you believe unless you make it public, to disagree with. No public disagreement, no heresy. 

Mind you, you don’t have to be the person making your disagreement public. Some other helpful soul can do that for you. It may or may not match any belief you recognize, but good luck proving that. However it happens, the public ingredient has to be added if we’re going to follow the recipe.

Although multiple religions have their battles over orthodoxy, the word heresy is heavily linked to Christianity, even though its root comes from a Greek word with no particular negative connotations–it meant choice, or sect. From its early days, though, the Catholic Church got to work deciding what the one and only correct set of beliefs was and organizing a way to divide correct beliefs from heretical ones, at which point heresy became ba-a-a-ad.

 

Irrelevant photo: I thought, given the topic, we might need something nice to look at. These are begonias.

In spite of that, believers kept coming up with new and more interesting heretical beliefs. Consult Lord Google on the subject and he’ll lead you to lists: the top 10 heresies of Christianity; the 6 great heresies of the Middle Ages; the complete list of heresies in the Catholic Church. The internet loves lists, especially numbered ones. 

I don’t particularly. We’ll skip them. 

If you’re serious about this, you’ll make a distinction between heresy and schism. I’m not and we’ll skip that too.

I know. You’re disappointed. But let’s define heresy: Canon Law–the ecclesiastical law of medieval Europe–defined it as “error which is voluntarily held in contradiction to a doctrine which has been clearly stated in the creed, and has become part of the defined faith of the church,” and which is “persisted in by a member of the church.”

 

Heresy as a crime

It’s a big jump, though, from heresy being something the church isn’t happy about to heresy becoming a criminal offense. The first step toward that–

Hang on. Can you step toward a jump? Oh, probably. That’s close enough not to mess up my metaphor. 

Onward.

The first step is that state and religion have to meet, mate, and intermingle their DNA. Be careful when someone proposes that, no matter how many good intentions the proposal’s dressed in. It gives religion the state’s power in punishing heresy and the state a claim to god’s endorsement, which in theory at least means it can’t be challenged. On anything. Because god said so. Neither of those things has, historically speaking, brought out the best in the people involved. 

Before Christianity mingled its DNA with state power, back when it was itself a persecuted minority religion, it advocated freedom of conscience and freedom of speech. Once it merged with the state and saw the error of its ways, freedom of conscience lost its appeal and it started to impose its views on everyone, since they were, by definition, the right ones.

But even if you have the authority to stamp out heresy, how do you establish who the heretics are? Let’s quote from T.M. Lindsay’s “Heresy” in the Encyclopedia Britannica, 9th edition, which is in turn reproduced in the online encyclopedia Theodora. (The link’s above if you want it–or if you don’t.)

“Pope Innocent III [1198 to 1216] declared that to lead a solitary life, to refuse to accommodate oneself to the prevailing manners of society, and to frequent unauthorized religious meetings were abundant grounds of suspicion; while later canonists were accustomed to give lists of deeds which made the doers suspect: a priest who did not celebrate mass, a layman who was seen in clerical robes, those who favoured heretics, received them as guests, gave them safe conduct, tolerated them, trusted them, defended them, fought under them or read their books were all to be suspect.” 

To be on the safe side, lay people could follow Pope Alexander IV’s advice and simply not argue about matters of faith. Basically, they could shut up and believe what they were told.

The church had its own laws and legal system, and it could inflict churchly punishments on people–things like excommunication–but you know how sometimes you just feel the need to hit things with a larger hammer? It was like that, and as early as the fourth century, you’ll find Christian emperors putting heretics to death, and as time went on we can add torture to the list of risks heretics faced. Or people presumed to be heretics.

By the thirteenth century, witchcraft was getting mixed up with heresy (it all had something to do with the devil, so why not?), and by the fifteenth century the belief had taken root deeply enough in the culture that when Protestants started rejecting Catholic beliefs, witchcraft never made the reject list. 

 

But weren’t we supposed to talk about England?

Sorry, yes, we were. I was trying to find a starting point for heresy being a crime and ended up on a European tour, but let’s come back to England. 

In 1401, Henry IV introduced De heretico comburendo into common law, allowing a diocesan to sentence a heretic to burn at the stake without consulting the synod or the crown. 

What’s a diocesan? As far as I can figure out, the bishop in charge of a diocese. What’s a synod? A church assembly of one sort or another. Listen, this isn’t my world and that’s the best I can do at short notice. The point is, the diocesan could pronounce sentence on his own say-so, and the sheriff then had to light the match, or its era-appropriate equivalent, since matches hadn’t been invented. So church and state were joining hands in getting rid of those pesky heretics, who kept cantering past, in spite of centuries of effort to keep everyone inside the orthodox corral.

What’s common law? Oh, please. You don’t want to get into that. At least not now. Can we just pretend we know what we’re talking about and move on? What matters is that being condemned as a heretic was enough to get you killed. For both church and state, that worked well until–

You know how I said that in order to have heresy you have to have orthodoxy? Well, this business about killing heretics became problematic when the definition of orthodoxy turned from a solid to a liquid–in other words, during the Reformation. When England was Catholic, Protestants were heretics. But then England became Protestant, and Catholics were the heretics. But nothing’s ever quite that simple, because more Protestantly Protestants were also heretics. And just when you thought you had all that figured out, England went Catholic again. Then it went Protestant again. Then it fought a civil war but both sides were Protestant, by which time everyone was so dizzy they had to sit out a few dances.

While they’re out from underfoot, let’s backtrack a bit: Under Elizabeth I (Protestant) heresy laws were repealed, but the authorities could still dig out that old bit of common law, de Whatsitum in Latinensis, and get the sheriff to light the match. Which still hadn’t been invented. So the repeal was incomplete.

In 1612, Edward Wightman was burned at the stake for heresy. This happened when James VI and I was the king (Protestant, and he was a single person, not two, in spite of all the numbers he had to drag around). Wightman was the last person burned for heresy in England, but he didn’t know that and for a long time no one else could count on it either.

Now we move forward again. Our dancers have recovered, the civil war’s over and the king’s Protestant. The country’s therefore Protestant, as is parliament. But wait, the king’s heir is Catholic.

Let’s pause there for a moment. The king still had the power to tip the country from one church into another and no one in power can be sure they wouldn’t be looking at a pile of dry wood and a sheriff fingering his imaginary matchbox when power shifts. They don’t know that Wightman’s the last of his particular brand of martyr.

I know, I’ve changed tenses. Doesn’t it all feel exciting in the present tense?

So in 1677, parliament abolishes burning at the stake as a punishment for heresy. This effectively decriminalizes it. And, not incidentally, takes away a powerful weapon that any future Catholic government, should there be one, could use against their nervously and publicly Protestant selves.

To placate the bishops in parliament, the act allowed bishops and ecclesiastical courts to continue punishing heresy, blasphemy, and atheism under ecclesiastical law, but not by death. 

Defenses of the freedom of conscience were in the air, but they weren’t what moved the lawmakers. Self-preservation was.

A quick history of the Chartist movement

Britain’s Chartist movement was one of those inspirational failures that people who try, against all the odds, to change the world love to talk about. They remind us not to count the game as lost until several generations after our deaths. At which point we can pretty well count on not knowing or caring who won.

Okay, that was more downbeat than I meant it to be. The Chartists lost but in some very real ways they also won. 

 

The basics

The Chartist movement began in 1838 with a People’s Charter, drafted by the London Working Men’s Association. It demanded six things:

  • Universal manhood suffrage. At a time when women had only recently been invented, that could almost pass for everybody having the right to vote. 
  • Electoral districts of equal size, meaning all voters would have equal influence. Or that was the theory anyway, and it was quite radical at the time.
  • Voting by secret ballot. That’s right–it hadn’t been instituted yet.  
  • Yearly elections for Parliament.
  • Abolition of property qualification for Members of Parliament.
  • Payment for MPs, which would open up the position to people who worked for a living.

The goal was to give working people political power. In other words, the charter gathered an impressive list of enemies. 

The ideas weren’t entirely new–you can find a lot of them threaded through English history–but it was new that in spite of some middle-class and gentlemanly leaders, the movement’s base was in the working class.

Irrelevant photo: It’s been a while since we’ve had a cat photo, hasn’t it? This is L’il Red Can, who’s no longer so little but can’t seem to escape his name. He is entirely apolitical.

The background

The movement began at a time when political reform was in the air, aggravating many an allergy among the aristocrats’ delicate breathing systems, since the aristocracy still held political power, although economically they were being eclipsed by industrialists.

In response to much popular campaigning, the 1832 Reform Act had made a few gestures in the direction of cleaning up the electoral system. It gave the vote to small landowners, (some) tenant farmers, (some) shopkeepers, and (some of the more solvent) householders even if they didn’t actually own the property they lived in. It also got rid of a fair number of rotten boroughs–constituencies where almost no one lived but that sent representatives (controlled by the local landowner) to Parliament. 

The Reform Act meant some 200,000 more men could vote, but that was out of a population of maybe 10 million. Admittedly, that included children and women, who so clearly wouldn’t know what to do with a vote if they fell over one, but it still left a lot of men voteless.

This was also a time of economic woe: 1837 and 1838 were depression years. Think low pay, hungry people, and unemployment, all aggravated by an 1834 law that replaced the earlier system of relief for the poor with workhouses. They’d be cheaper. They’d be more efficient. They’d get beggars off the street, attack the moral failings that led people to be paupers, and encourage them to work. 

Doesn’t that sound familiar? 

So, no more handouts just because you were out of work and starving during a depression. The poor would go into workhouses, families would be separated, their lives would be controlled, and they would be set to work under deliberately harsh conditions.

Semi-relevantly, the government that introduced this was led by Earl Grey, who gave his name to that elegantly flavored tea. 

It was also a time of rebellion. The Swing Rebellion and the movement to defend the Tolpuddle Martyrs were in the recent past.

So working people weren’t in a good mood and it wasn’t irrational for them to think that if they could vote they’d be represented in Parliament in proportion to their numbers, and that would bring about a more just organization of society.

It hasn’t exactly worked out that way, but it made sense at the time. 

 

The story

The Chartist movement centered on a petition that gathered more than 1.2 million signatures at a time when petitions were pieces of paper (you remember paper?) and had to be passed from hand to hand and delivered as actual physical objects.

You remember physical objects?  

To gather those signatures, speakers fanned out across the country, addressing actual groups of people (you remember people?), and all of this running around and meeting and speaking built an organizational framework that brought together English, Scottish, and Welsh radicals, as well as Irish supporters of Home Rule, making it not just a movement of working people but a fully national one. 

Inevitably, different parts of a coalition will pull in different directions, and the most important disagreement was over what to do if (or as many expected, when) Parliament rejected the petition. Call a national strike? Rely on moral force? Rely on physical force?

The question hadn’t been settled by the time Parliament rejected the petition, and it probably couldn’t have been. Some coalitions are hard to hold together and talking doesn’t resolve all disagreements. Riots broke out, some of which were intended to turn into full-scale uprisings and at least one of which was set off by Birmingham’s authorities banning gatherings and then breaking up the one that happened–not to mention arresting two of the more moderate leaders.

But let’s not slog through this battle by battle, attack by retreat, riot by gathering. Soldiers were called out. People were arrested–550 of them in 1839 and 1840. People were killed. Leaders were convicted of treason and sentenced to be hanged, drawn, and quartered–a sentence so out of keeping with the times that in the face of protests it was commuted to the harsh mercy of transportation to Australia.  

 

Parts 2, 2 ½, and 3

The second petition was delivered in 1842. It had twice the number of signatures and Parliament was impressed enough to say, “Why should we care about you? You can’t even vote.”

Okay, that’s not an exact quote but it does catch the spirit of their response.

Violence broke out here and there, and respectable opinion held the Chartists responsible for it, but around the country wages were being cut and in response workers were going out on strike. This was the beginning of what was known as the Hungry Forties. Some Chartists inevitably would’ve been involved, but the strikes were more spontaneous than organized.

No union movement existed to support them, and none lasted long.

Having said that, though, at least one source talks not about strikes but about a general strike–one that had not just economic but also political demands: the adoption of the Charter.

After that we get six years of Chartist energy pouring into model communities of various sorts, generally involving equal ownership of land or assets. Some were trying to make their participants eligible to vote so they could elect MPs to represent them.

A third petition made the rounds and was presented in 1848–a year of revolution in continental Europe. Presenters claimed it had 5.75 million signatures. Three days later, the Commons Committee for Public Petitions said it had counted all the signatures and found fewer than 2 million, some of which–including Queen Victoria’s–were obvious forgeries. 

Feargus O’Connor–a Chartist MP representing Nottingham, and the person who’d presented the petition–said three days wasn’t enough time to count all the signatures.

Was so too, the committee said.

Was not never, O’Connor said. 

And those aren’t exact quotes either.

O’Connor challenged another MP to a duel, then withdrew the challenge.

It was not the finest moment of the Chartist movement.

The petition–to no one’s surprise–was rejected. A few riots followed and a planned rebellion failed. Almost 300 Chartist leaders were arrested and sentenced to transportation or long imprisonment, although death sentences were again commuted. 

Chartism didn’t die on the spot, but between internal divisions, questions about the petition’s validity, repression, and a better economic situation (which at least one source says didn’t trickle down to rank and file Chartists and therefore was unlikely to have had an effect) it was never again the force that it had once been.

 

Women in the Chartist movement

The Chartist leadership was male, and to the limited extent that women’s right to vote was discussed, the movement backed away from it–on some people’s part because of the assumptions of the day (women belonged at home; women needed the vote almost as much as soldiers needed water wings) and on others’ because it would make the movement too controversial and open it to ridicule, since the idea of women voting was inherently absurd. 

Even so, women got involved. They came from families; they had families of their own. The vote was a weapon that might improve their families’ situation, even if they didn’t get their own hands on the weapon. So they attended meetings. They raised money. They organized tea parties and boycotted anti-Chartist shopkeepers. 

A few women leaders did emerge, although they never became as well known as the men. 

I know. You’re shocked. 

Mostly, though, the women worked within their socially acceptable role, pushing its edges outward, and none of what they learned at those edges was likely to have been lost.

Sometimes it’s the right time for that and sometimes it isn’t, and sometimes it depends on what each individual can do. But never underestimate the women who don’t break out. They start out by making tea and worshiping heroes and the next thing you know they want to vote and be heroes themselves.  

 

The aftermath

Chartism continued in one form or another for some ten years after the third petition, but its high point had passed. Some of its leaders–and probably, if less verifiably, some of its followers–took their skills to other campaigns. 

The right to vote did expand, but the government wasn’t in any kind of a rush about it. Before 1918, only 58% of adult men could vote. That year, property restrictions were abolished for men, and women over 30 were given the vote–but only if they, or their husbands, met a property qualification. It was 1928 before women could vote on equal terms with men.

As for the other demands:

  • The secret ballot was introduced in 1872.
  • These days, constituency borders are regularly redrawn to keep them of roughly equal size–sometimes controversially, but the principle is there. I’d love to tell you when that started, but I got bored witless before I found an answer.
  • The property qualification for MPs was abolished in 1858, but being an MP didn’t become a paid job until 1911. 

That only leaves one of the Chartists’ demands unmet: yearly elections for MPs.

*

In addition to the links, I’ve also relied on David Horspool’s book The English Rebel.

Medieval England’s piepowder courts

In the Middle Ages, English fairs and markets had a fast-acting justice system called–well, what it was called sort of depends on how you want to spell it, and then your best guess about how to pronounce it. This is English, remember. Pronunciation and spelling aren’t often on speaking terms, and in the Middle Ages spelling was still a liquid–years away from taking on a fixed form.

The spellings I ran into most often were pie poudre and piepowder, but the West Sussex Records Office adds “pyepowder, pipoulder, pepowder, and pipoudre,” and notes (gleefully) that the spelling sometimes changed within the same document. How do you pronounce it, then? I consulted Lord Google, as I so often do, using the pie poudre spelling, and he led me to a website that asked if the phrase was Catalan, Mandarin, or Australian English. It didn’t matter what I chose, though, because it couldn’t actually hack up a pronunciation in any of them, but that was fine since by then I’d pretty well lost my trust in it.

YouTube, however, looked me right in the eye and swore the correct pronunciation is pie powder. I have no reason to think YouTube knows what it’s talking about, but let’s go with it anyway. It’s hard to remember a set of letters unless your brain can tack a pronunciation onto them. Or that’s how my brain works, anyway. When it works at all.

However you pronounce and spell it, though, we’re not talking about an instant pie mix. The name came from the French for dusty feet, pieds poudrés, or so Lord Google, in the authoritative person of the Encyclopedia Britannica (with a little help from the West Sussex Records Office), assures me. 

We’ll come back to that, and I promise it’ll almost make sense. 

Irrelevant photo: a California poppy after the rain.

What was the piepowder court?

It was the lowest level of common-law justice in medieval England. As the Britannica puts it, it was constituted by merchants. It then defines constituted in several different ways, leaving me to wonder if it was made up of merchants or if merchants organized it or if they actually established it. 

Screw it. It existed. Merchants were involved. Let’s move on. 

The court dealt with problems that came up at markets or fairs–and medieval fairs, remember, were places where business got done. So they heard arguments about who cheated who, who stole whose spot, and who was a disorderly nuisance. 

The piepowder court would meet for as long as the market or fair lasted, and people could drag each other into court to be judged on the spot by the merchants in charge. With the dust of the market still on their feet.

You get a sense of medieval snobbery from that, don’t you? Dusty feet? The horror!

Let’s go back to West Sussex for its take on dusty feet:

“What this is referring to is most likely the people who travelled to towns from far and wide for market days–travellers and vagabonds. Within modern French, pieds-poudreux is supposedly used for travelling beggars. Another given reason is how it relates to the speedy justice that was administered. Or perhaps another origin comes from how members of the piepowder courts were constantly walking around the markets, the dust coating their feet as they moved.  It’s possible the term references them, rather than the travellers and merchants. In my mind the first is the most obvious answer, but it is likely the true answer is a combination of all three.”

An alternative explanation is that justice was done as speedily as dust can fall from a person’s foot. We’ll probably never know, so take your pick.

 

How did they work?

Once someone accused someone else of whatever, the court had to make its decision within a day and a half. Markets and fairs attracted people from outside their areas, and they couldn’t hang around, waiting for the court to get around to them. 

The piepowder court in Bristol, at least, had three or four judges, and it was up to the accusers to prove their cases. Then the defendants could argue their innocence and present their evidence. This was unusual in medieval courts. They generally relied on oaths. People would bring in a set number of their equals who’d swear they believed the oath.

Piepowder courts could punish a person with a fine or the pillory, and if they didn’t pay up the court could seize their goods.

 

Opening the court

West Sussex still has a record of how to open the court at the Chichester Sloe Fair. 

“Let the Cryer make Proclamation on the South Side of the High Cross as follows – at 8 o’clock:

“Oyez – All manner of Persons that have to do or intend to have to do At the Ancient Pavillion Court of the Right Rev. Father in God Sir Wm Ashburnham Bart. Lord Bishop of Chichr holden on this Day [at the Gate commonly called the Canon Gate] for this City and the Liberties there of with the Fair called the Sloe Fair, for the time and space of Eight days beginning this Day being the Eve or Vigil of the Feast of St Faith the Virgin come forth and give your attendance. God save the King.”

For that, the town crier got a cut of the court’s income.

Yes, of course the court made money. Even if justice is supposed to be blind–and I doubt the phrase wandered into the language this early–it sure as hell doesn’t do its work for nothing. And fairs and markets were all about making money. The Sloe Fair’s income went to the Bishop of Chichester.

A sloe? It’s the fruit of the blackthorn and grows wild or in hedgerows, although History Extra reminds us that hedgerows didn’t really proliferate until the 16th and 17th centuries, with the full blast of the enclosure movement. It looks like an oversized blueberry, but I doubt you’d want to pop it in your mouth without cooking it–and sweetening it if you can. Ask Lord G. about recipes and he’ll tell you about sloe gin, about jelly, and about cooking it with meat. Mostly, though, it’s about the gin these days.

 

The end of the piepowder courts

The Chichester court last sat in 1834.

Bristol’s piepowder court was active until 1870 and Hemel Hempstead’s last sat in 1898. The Courts Act 1971 formally abolished them, which by then was just a formality, and as far as I can figure out without taking on more research, the Administration of Justice Act 1977 did the same thing all over again. 

The Bristol court, by the way, prefers to call itself the Court of Pie Poudre, thanks.

Saffron in Britain: a quick history

People in fourteenth-century Europe were desperate to get their hands on saffron, which they used, among other things, as a medicine against the plague. Or they were if they could afford it, which most people couldn’t because it was wildly expensive, so let’s add “rich” before “people” in that sentence. It was expensive enough that pirates often preferred saffron to gold–it was worth more and easier to lift.

C’mon, even pirates can get bad backs.

 

How saffron got to England

According to legend, saffron got to England as an illegal immigrant, traveling inside a Crusader’s hollow staff. He picked it up, still according to legend, returning from the Middle East by way of Spain, and if you’re a fan of irony, you might enjoy knowing that it was the Arabs–the people that hollow-staffed Crusader would’ve been fighting–who brought saffron to Spain so he could steal some.

Why did the Crusader (in a sanitized version of the tale, he was a pilgrim) have to smuggle it? Because he’d stolen it. Places that produced saffron wanted to prevent competition, so for example Basel (which admittedly wasn’t in Spain, even during the Crusades) made it illegal to take a corm out of the city and guards protected the plants when they were growing.

A rare relevant photo: The ones in the foreground are crocuses.

Was that true in Spain? Dunno. It’s a legend. Let’s slip that illegal corm into a pocket and move on before anyone notices the geographical switcheroo.

What’s all this corm business, though? 

Well, kiddies, saffron comes from the crocus plant–the Crocus sativus–which grows from a corm. And a corm is what you and I, in our ignorance, would probably call a bulb. The difference is that a corm is–oh, hell, it’s complicated. A corm is rounder than a bulb and it’s solid. That’s enough to let us pretend we know something. 

You can probably smuggle a corm inside a hollow staff if you don’t pound it around too much and if you just happen to have a hollow staff on hand, but whatever happened took place outside the range of the CCTV cameras, so we’ll never know for sure. 

A different version of saffron’s British history has it landing in Cornwall multiple centuries earlier, not necessarily as a corm but in the form of a spice that could be traded again and again for Cornish tin. As far back as three thousand years ago, Cornwall was trading with the Middle East, so it’s entirely possible that tin was traded for saffron, but the ice is getting thin here and we might want to scuttle back to shore before we break through.  

Before I dump a new subthread on you, though, I should explain that the word sativus in Crocus sativus doesn’t mean the saffron crocus is related to Cannabis sativa. Sativa or sativus is Latin for cultivated, not for formerly illegal and still mind-bending.

 

How to get from crocus to saffron 

So much for legend. What’s clear is that saffron arrived in England (and by this time Cornwall was part of England), and from the fourteenth century onwards it was an important commodity. It was used in dyeing, in cooking, and in medicines, and (sorry to repeat myself) it was and is incredibly expensive. These days, it’s the world’s most expensive spice. 

That’s not because it’s rare or hard to grow–make a crocus plant happy and it will spread all on its own–but because you only use a small part of it to make saffron. According to the Britannica, “What we use . . . is actually the stigma (plural stigmata)—the pollen-germinating part—at the end of the red pistil, the female sex organ of the plant.” 

Harvesting those tiny little sex organs (try not to think about it; you’ll be happier) involves crawling along the ground and cutting a very low-growing flower, then throwing away most of it. Along the way, you have to separate the stigmata (each plant has three) and their stems (those are the pistils) and dry them. 

Do that with 75,000 plants (or 150,000, depending on your source) and you’ve got yourself a pound of saffron. In 2018, that pound sold for $5,000. 

The next most expensive spice, vanilla, sold for $600.

 

Could we get back to English history, please?

Fine. If we can agree that the stuff’s expensive, we’re ready to go back and look at it as a luxury item.

Starting in the fourteenth century, England became a major producer of saffron, and the chalky soil of Essex and south Cambridgeshire turned out to be well suited to it. Smallholders–people raising crops on small amounts of land–who’d once been subsistence farmers planted it as a cash crop, probably not replacing all the crops they lived on but as an addition. An acre planted in crocuses could bring in £6–a hefty amount of money at the time. Saffron became so important to the local economy that the town of Chipping (or Chepyng–they couldn’t spell for shit back then, but it meant market) Walden changed its name to Saffron Walden.

According to the historian Rowland Parker, successful cultivation depended heavily on unpaid labor, which was a major part of the farm economy for a couple of the centuries we’re talking about. Serfs owed labor to their lords. Smallholders had families, preferably large ones. 

I relied on WikiWhatsia for that. I avoid it when I can, but I’m tired this week and can’t be bothered. My apologies to the world at large. In general, it’s as reliable as the grown-up encyclopedias, but when it fucks up it can do it spectacularly. And I did confirm a few bits, so the entry looks reliable, at least at the moment.

The Cambridge colleges used saffron heavily. Smallholders who rented land from them could pay their rent in it, and some of the colleges used it to pay their own bills, making it a kind of currency. 

But currency or not, academics also used it in food and as medicine. And they sprinkled it on floors and tossed it into their fires (talk about burning money) as a disinfectant. That was probably just a few academics–the richest ones, making a point of being the richest ones.

 

Nothing lasts forever, though, does it?

Change came in response to several things. As the spice trade grew, other offerings became available, and they weren’t only new and exciting, they were cheaper. The elite could spend their money on vanilla, tea, chocolate, and coffee. All of those were outrageous luxuries for a while.

Saffron? That was so last century.

Synthetic dyes also began to replace natural ones. And as the wage economy grew, people left the countryside and that pool of unpaid labor wasn’t around to dip a seasonal bucket into. Growers replaced saffron with the newly introduced crops: potatoes and corn. 

Corn? Sorry. I’m still basically American. The British call it maize, since they call pretty much any old grain corn.

If that list of changes doesn’t sound like enough to explain saffron’s decline, consider the Puritans, who wandered in to disapprove of this saffron-burning culture of excess. They wanted their clothing plain, their food plain, and their fires unbothered by show-off gestures. 

Saffron cultivation and usage declined, but in Cornwall, saffron buns and saffron cakes are a long-standing tradition. 

How long-standing? The sources I’ve found hide behind some vague wording about them being traditional, which means they don’t have to commit themselves on how far back the tradition goes.

 

Saffron Buns

I haven’t posted a recipe in an age, but I do make a mean saffron bun–and if you don’t speak American, mean in this context is a good thing. In spite of my accent, they sell well at bake sales and the local farmers’ market.

Don’t be put off by what I said about the cost of saffron. You won’t be buying it by the pound. All you’ll need is a pinch. 

 

Ingredients

A large pinch of saffron

300 grams of bread flour (or whatever substitutes for that where you live)

65 grams of butter, softened

25 grams of sugar

1 tsp yeast (use fast acting–it’s easier)

A pinch of salt

90 grams currants (or raisins if need be)

45 grams of candied peel (I never do get around to adding this)

Milk (the recipe I started with calls for 120 milliliters, but I always need more)

 

What to do with the ingredients

Crush the saffron and soak it in just enough boiling water to cover it. Cut the butter into the flour. Mix in the sugar, salt, yeast, and fruit. Add the saffron, in its water, and enough milk to form a dough. Don’t let it get too wet, because the buns have to hold their shape. 

Knead it until it’s silky–about 10 minutes by hand, about 5 in a mixer. Cover and let it rise. How long will depend on the temperature of your kitchen, but if you have to punch it down and let it rise again, it’ll be fine. 

Cut into 8 pieces and form into rolls. Place them on a cookie sheet–called a baking tray in Britain–and use greaseproof paper or baking parchment if you have it. Otherwise, oil the tray.

Let them rise half an hour or so, until the dough has a little spring in it.

Bake for 20–25 minutes at 170 C. (that’s 350 F., give or take a bit). To check if they’re done, turn one over and tap the bottom. It should sound vaguely drumlike.

Cool. Butter. Eat. Toast if that appeals to you.

Why Britain’s days off are called bank holidays

When Britain takes a day off work, it calls the day a bank holiday. England has eight of them, Scotland has nine, and Northern Ireland has ten. Or at least, that was the 2020 count. The queen can add one if the mood takes her, and she’s done exactly that for the 70th anniversary of her queenship. 

Why don’t we get a separate count of holidays for Wales and Cornwall? Because they’re still tucked under England’s wing, and every so often, if you listen carefully, you’ll hear a bit of uncomfortable squawking and rustling under there.

Entirely relevant photo: This is Fast Eddie (in slow mode). He doesn’t have to wait for a bank holiday to take a break.

What do banks have to do with not working? 

I’m so glad you asked. Bank holidays were introduced by the first Baron Avebury, whose real name was John Lubbock. In 1871 he drafted the Bank Holiday Bill, which true to its name had a limited scope: It was about holidays for banks and financial buildings.

Buildings? Let’s assume they mean institutions. Buildings go on being buildings even when they’re empty and the doors are locked.

Listen, I only write this shit. I’m not what you’d call responsible for it.

If Lubbock sounds like Santa Claus, handing out days off work, he wasn’t. Bank holidays started before he came along, although I’m not sure they were called that. The Bank of England, the Exchequer, and other public offices took days off for royal events, Christian holidays, and assorted saints’ days (which I’d have lumped into the Christian holidays category, but see above for me not being responsible). Add them all up and you got around 40 of them.

In 1830, that was cut back to 18, then cut to 4 in 1834. But a precedent had been established.

 

What did Lubbock’s act really do?

Read the small print and you discover that the act wasn’t so much about creating holidays as it was about making sure that banks didn’t get penalized for shutting down on a weekday. Any financial wheeling and dealing was postponed till the next day. Bills and promissory notes that were due on bank holidays wouldn’t be due until the next day. But in the process, it standardized the days that were protected that way.

Now can I confuse the picture for a minute? Please? 

Having told you about the many holidays banks used to take, let me quote another source that acknowledges them but also says that before the act banks couldn’t close on a weekday because they’d have been risking bankruptcy. You figure out how to fit those two together. I’m lost.

Over time, shops, schools, other businesses, and the government itself started closing down on bank holidays, but everyone still calls them bank holidays. 

 

A bit of background

The industrial revolution–and the act came along in the middle of it–lent some oomph to the standardization of holidays. It was cheaper for a factory to shut down on a given day, or even for a given week, than to have people wander off wherever they wanted to. 

Not that they could’ve wandered off without getting fired, mind you. But even the great industrialists–those fine folks who kept both adults and children working eighteen-hour days for the most minimal pay–couldn’t keep them working 365 days a year. Among other things, holidays had a religious origin, and theirs was still a religious culture. 

Some things, even the industrialists couldn’t face down. Religious tradition was one of them.

 

Enough about the holidays. Let’s talk about Lubbock

Lubbock’s other claims to fame are that he was a science writer, a banker, and a politician. We can assume it was the collision of those last two claims that led him to think of standardizing bank holidays.

His science writing was more than just a rich man’s hobby. He published books on archeology, entomology, and animal intelligence, and it was in relation to that last subject that, as you’d expect from someone so sober and well connected, he tried to teach his poodle to read flash cards. The Britannica says his book “established him as a pioneer in the field of animal behavior.” 

You can go tell that to my dogs. In spite of his experiments, they remain woefully illiterate.

In his writing on archeology, he introduced the words Paleolithic and Neolithic to the world, and in the spirit of high-minded racism, he titled his book Pre-historic Times, as Illustrated by Ancient Remains, and the Manners and Customs of Modern Savages. It was “probably the most influential archeological textbook of the nineteenth century.”

I don’t suppose I need to comment on that.

Having already become a baronet when his father died, he was later made a peer and took the title Lord Avebury, after the stone circle near Stonehenge that he bought in order to protect it from builders.

Okay, he bought the land it stood on. They tossed the stones in for free.

It’s a hell of a stone circle. If you’re in the neighborhood, do stop by.

A note about that newsletter I claimed I was going to send

To those of you who were kind enough to sign up for my alleged newsletter, I have to report that there won’t be one. It’s a complete flop. Or I am. I had an extended wrestling match with MailerLite and although I didn’t break any equipment or murder anyone, I did threaten all of the above. Basically, all I was going to send was an announcement that my next novel was out, and I’ll do that right here, in this very spot, about a week from now. So you didn’t miss anything anyway. 

And to those of you who didn’t sign up, weren’t you clever? 

I don’t know why I thought setting up a newsletter was a good idea anyway. It’s something that the folks who seem to know things advise writers to do. I think the idea is that if you pop up in people’s inboxes, they won’t be able to get away from you until they’ve bought your book, but we all know that’s not true. They–or you, or we–can leave any time they/you/we want. 

Besides, here I am, popping up in your inbox anyway.

Mothering Sunday and Mother’s Day: a short history

Britain’s Mothering Sunday looks like the sister holiday to the U.S. Mother’s Day, but its roots (no surprise here) go back further and–I was going to say it’s a stranger story, but they’re both strange. 

Let’s start with Britain’s holiday.

Mothering Sunday

This started out as a church event that some date back to the 16th century and others trace to full-on medieval times. It had nothing to do with honoring mothers. On the fourth Sunday of Lent (March 27 this year), people went to the main church or cathedral near where they lived, which was called their mother church and which had a special service that day. The rest of the year, they went to their nearest church–a daughter church. 

You’re right: Hierarchy was built into everything.

One theory of the tradition’s origins is that it grew out of a Bible passage that was assigned as the reading for that day. (Apparently, the Church had assigned readings for Sundays and holidays. Who knew?) It had to do with Jerusalem, “which is the mother of us all.” And since it’s all in the interpretation, you can get from there to the mother church in three easy steps. Or two if you’re good at the game.

Marginally relevant photo: spring flowers. Actually a little early for either Mother’s Day or Mothering Sunday.

The day took on the air of a holiday. One source says domestic servants (that may exclude other categories of underpaid underlings) were given the day off to “go a-mothering” and also to visit their families. That might include their flesh-and-blood mothers, although since having children was a hazardous occupation you couldn’t take it for granted.

Another source doesn’t limit the day off to domestic servants but includes apprentices and reminds us that children as young as ten left home to work away. In this telling, as they walked the country lanes on their way home they picked a few wildflowers as a gift. 

It’s a sweet image and, I suspect, based more on guesswork than documentation. But that in itself is guesswork. Don’t take it too seriously. 

Another source (the link’s somewhere below–don’t bother me when I’m working, sweetheart) says the mother church tradition was medieval and the tradition of visiting family didn’t start until the 16th century–and it had a practical reason: The holiday fell during what was known as the hungry gap, when the winter’s stores were running low or used up and the fields and hedgerows didn’t offer much to eat. So servants and apprentices might go home bringing food or money. 

Let’s hope they had some to bring.

Cake

Since it’s a law that you can’t have a holiday without food (even the holidays where you fast put a big emphasis on what you eat when the fast ends), Mothering Sunday is associated with a cake, called Simnel cake, which for some reason gets a capital S. It’s a fruit cake with two layers of almond paste and eleven layers of religious symbolism.

How’d they get away with cake when it was Lent and people weren’t supposed to eat anything tasty or fun? 

Aha! They did it by reading the small print. The rules of Lent were relaxed for this one day, and so the day was also known as Refreshment Sunday. And that too was linked to a Bible verse, the one about Jesus feeding a multitude with bread and fish. Not with a fruit cake with two layers of marzipan, but it’s all in the interpretation.

The day was also called Mid-Lent Sunday, in case that’s on the test.

A break in the tradition

All of that–with the possible exception of the cake–went out of fashion in the 20th century.

Enter Constance Adelaide Smith, who kicked off a revival, starting with her 1921 book, written under the pseudonym C. Penswick Smith and subtly titled The Revival of Mothering Sunday.

She called for a holiday to honor many forms of motherhood–the mother church, Mother Earth, mothers of children, the mother of Jesus, and–well, I’m sure she could’ve gone on. And did. The tradition already existed, she argued, but needed official recognition to kick it into high gear.

She did not say “high gear.”

The medieval idea of motherhood as she saw it–at least according to one source–was rugged and diverse. 

Rugged? Well, the British Library’s blog illustrates this point with a medieval painting of Mary handing off the baby Jesus to an angel (“Here, you, do something useful and hold the kid”) so she can sit on the devil and do a spot of wrestling. While wearing a pristine, floor-length skirt. To the modern eye, it’s an odd picture–especially the freeze-frame wrestling match–but I’ll admit to liking it.

Sort of. But only for its oddity.

Diverse? The medieval holiday wasn’t about honoring your own particular mother but motherhood in many forms. Or at least in one of the forms Smith included in her list: the mother church.

Smith herself had no children, which may be relevant here.

Yet another source, though, mentions that the medieval holiday wasn’t the uplifting event she imagined. Among other things, parishes were likely to get into brawls over who’d go first in the processions.

These things are always neater in hindsight.

Smith had another reason to go back to the medieval period. She’d been inspired by the U.S. creation of Mother’s Day (1914, since you asked) but didn’t want it to displace British traditions.

According to historian Cordelia Moyse, “A lot of people felt that industrialisation and urbanisation were destroying British culture and community.” So Smith took the medieval tradition, knocked off the mud and manure, polished it up a bit, and presented it as home grown, deeply rooted, and coming from a time of greater harmony, when people knew their neighbors and got into fights in church processions.

The idea caught fire at the end of World War I–according to one source because of the country’s many losses in the war. That doesn’t entirely make sense–it was young men who died in the war, not mothers–but grief’s a funny thing and will pour itself into any container it finds.

By 1938–or so it was said–Mothering Sunday was celebrated in every parish in Britain and every country in the empire.

Mother’s Day

Now we shift to the United States, where we already know Mother’s Day became an official holiday in 1914.

How’d that happen? Well, kiddies, it started in the previous century (that’s the 19th; you’re welcome) in several smallish ways. Before the Civil War, Ann Reeves Jarvis helped start Mothers’ Day Work Clubs, which were to teach local women how to care for their children. Forgive the cynicism, but my guess is that local women had been bringing up children for generations–that’s why some were still available for Ann R. J. to teach–but never mind. I’m sure Ann R. J. knew how to do it better than they did.

Then in 1870, Julia Ward Howe (she wrote “The Battle Hymn of the Republic” and was a pacifist and abolitionist) wrote the “Mother’s Day Proclamation,” which called for mothers to unite and promote world peace. In 1873, she called for a Mother’s Peace Day. 

Juliet Calhoun Blakely, a temperance activist, convinced Albion, Michigan, to celebrate a Mother’s Day in the 1870s.

All of that seemed to go nowhere, as these things so often do. Then in 1907, Anna Jarvis held a memorial service for her mother, Ann R. J. Who was dead at the time. That doesn’t seem entirely relevant, but see above about grief.

In 1908, Jarvis got a Philadelphia department store owner, John Wanamaker, to back a Mother’s Day celebration at a West Virginia church and, ever so coincidentally, to hold a Mother’s Day event at his stores. 

From there she campaigned for the holiday to be added to the national calendar, organizing a letter-writing campaign to newspapers and politicians. First towns and cities adopted the holiday, and then it became national. It falls on the second Sunday in May.

After that, it all went wrong. Her idea involved a single white carnation, a visit to Mom, and a church service, but the florists, candy companies, and greeting card companies saw dollar signs and the holiday became a money spinner. (My own mother called it Florist’s Day.)

Jarvis might’ve seen that coming but apparently didn’t. She was cagey enough to enlist both Wanamaker and the florist industry when she was campaigning for the holiday. 

By 1920, she was denouncing the day’s commercialization and urged people to stop buying Mother’s Day flowers, cards, and candy. Eventually, she was launching lawsuits against groups that used the name Mother’s Day. 

In 1948, she denounced the holiday completely and lobbied to have it taken off the U.S. holiday calendar.

It wasn’t.

The lawsuits ate through her money and she died broke. The floral and greeting card companies that she had campaigned against paid her bills.

If anyone’s campaigning to establish National Irony Day, her story’s a perfect fit.

And Father’s Day?

No insult to fathers intended here, but it’s easier to get sentimental about a group that’s ignored or treated badly the rest of the year. Then once a year, you show up with flowers and chocolate and, you know, that makes it all okay. 

Fathers, though? They just don’t have the same appeal. Although you can trace Father’s Day back to the middle ages too, if you want.

Of course you want. European Catholics celebrated Saint Joseph’s Day on 19 March, and a tradition of celebrating fatherhood in general can be traced back to 1508–which doesn’t say that it began then, only that if it started earlier no one’s found the notes.

In 1966, the U.S. made it a national holiday. It’s also celebrated in the U.K., but it’s not an official holiday there.

What does freedom of the city mean?

Not long after Prince Andrew gave up on huffing and puffing until he blew down Virginia Giuffre’s house–in other words, after he settled her lawsuit out of court–the city of York rescinded an honor it had given him back when he looked a bit less sleazy than he does today: the freedom of the city.

This is significant because, um, why?

Well, it’s not, really. Or it is, but only if you take British traditions seriously, which I have some trouble doing but I’m sure Andy doesn’t. No one could run around dressed in those uniforms if they didn’t take it all seriously. 

Still, in the avalanche of bad publicity that’s fallen on him lately, York’s contribution is barely a pebble. But since it’s an intriguing pebble, let’s talk about what this freedom of the city business is.

Irrelevant photo: This was taken during either Storm Dudley or Eunice, although I’m damned if I remember which one. My partner swore they sounded like an aunt and uncle from Oklahoma–ones no one looked forward to seeing. All that white stuff? That’s foam. We had enough wind to whip the ocean into a meringue.

Starting at the beginning

Freedom of the city dates back to the middle ages, when lords were lords and serfs weren’t free and any sensible person would’ve told you this was the natural order of things. 

All that non-freedom is what made the freedom of the city matter.

According to a “purported law” of William the Conqueror’s–he’s the guy, remember, who won England as his very own plaything in 1066–“If serfs reside without challenge for a year and a day in our cities, or in our walled towns, or in our castles, from that day they will effectively be free men and forever free from their bonds of servitude.”

For a law that’s no more than purported, it seems to have had an impressive impact. It was repeated in various ways by various cities and rulers. Henry II gave Lincoln a charter saying, “Should anyone reside in my city of Lincoln for a year and a day without being claimed by any claimant, and he is contributing towards the customary dues of the city, and the citizens can prove (by the customary legal process of the city) that a claimant was present in England but made no claim upon him, thereafter he may remain in my city of Lincoln, undisturbed as before, as my citizen, without legal challenge.”

For claimant, you can substitute lord–someone with a feudal right to claim this person as, effectively, his property.

Elsewhere, you’ll find specific statements about a villein (that’s what you and I would call a serf) being freed of villeinage if he lives “undisturbed for a year and a day in any privileged town, to the point that he is accepted into its community (that is, gild) he is thereby freed from villeinage.”

Gild? That’s what we’d call a guild. Hold onto that word, because we’ll come back to it.

 

Consulting the grownups about this

Notice that bit about privileged towns. This year-and-a-day stuff didn’t work in just any town. You couldn’t hide out for the required time in your local market town and hope to be free. The magic only worked if the spell was written into the town’s charter. 

But not every town or city was welcoming to fugitive serfs.

Do I have details about that? I do not. The best I can tell you is that historians aren’t in universal agreement over how common it was for villeins to free themselves this way, or how welcoming or unwelcoming towns were. And since historians are the grownups in this discussion, we’ll leave this for them to work out while we go upstairs and do whatever they told us not to.

It’s worth knowing that free men didn’t live only in cities. They also lived in the countryside, working the land more or less as serfs did. The difference was that they rented their land, didn’t owe the lord any service in kind, and were free to leave, although they couldn’t necessarily afford to. You could be free and as poor as the neighboring serf–or poorer. 

Nothing’s ever simple, is it?

 

Two footnotes 

  1. Becoming a free man didn’t make you a freeman. That was a different category and we’ll get to it in a minute. What being a free man did do was make you not-a-serf, which was a major change in status, even if it wasn’t the solution to all your problems. 
  2. Almost everything I’ve found talks about free men. Only the Guild of Freemen of the City of London website acknowledges references to women having been guild members. Given the English language’s counterproductive tradition of sometimes insisting that men means both men and women and the rest of the time insisting that men means only men, figuring out what we’re talking about here isn’t easy, but the year-and-a-day thing does seem to have applied to women. As far as I can tell.

 

Guilds, freemen, and free men

It’s not just the men and women who are hard to tell apart. Several websites get woozy about the difference between free men and freemen. So when the city of Birmingham, by way of example, explains what freemen means, it’s hard to know if it applies to both free men and freemen.

Don’t you just love the English language?

What does the Birmingham website say? “The medieval term ‘freeman’ meant someone . . . who had the right to earn money and own their own land. People who were protected by the charter (rules) of their town or city were often ‘free’, hence the term ‘Freedom of the City.’ ”

Are you confused yet? 

Good. Then you’re following the discussion. You could live in a city and be free, but not be a freeman, and therefore (at least as time went by) not someone who had the freedom of the city. To become a freeman of a city or town, you had to be accepted by one of its guilds, and they limited their membership. If too many people have the right to practice as, say, goldsmiths, prices will drop.

The medieval guilds were powerful organizations, made up of merchants or craftspeople (who weren’t always men). They had a monopoly on their corner of the economy and regulated trade, standards, apprenticeships, and prices. Each one protected its interests, and they often controlled city or town governments.

If you couldn’t become a member–and unless you had connections, you probably couldn’t–you might well be free and a man, but you were stuck working as a laborer. You weren’t a freeman of the city.

 

More about freemen

The Portsmouth City Council website skips over free men and goes straight for freemen: “The institution of freemen or burgesses dates from the early beginnings of municipal corporations in the twelfth and thirteenth centuries. Freemen or burgesses enjoyed considerable political privileges, being entitled to elect the officers of the corporation and its representatives in Parliament, although they were not necessarily resident in the borough of which they were burgesses or freemen.”

In this context, the corporation was the city government.

“In choosing freemen or burgesses, boroughs found it convenient to admit men of national importance who might be able to secure greater economic or political privileges for the area. Prominent local landowners with interests in a borough would reward their supporters by securing their admission as freemen or burgesses–between the sixteenth and early nineteenth centuries a very high proportion of the known burgesses in Portsmouth were not resident in the borough.”

In other words, freemen were a select group of a city’s residents (or, just to confuse the picture, non-residents). They were people with power and money. That held until 1835, when the Municipal Corporations Act established city councils. After that, they might very well still have held power, but they had to exercise it differently.

 

Can we confuse the issue a bit more?

Of course we can. Let’s go to Texas, where a couple of Freedom of the City certificates are sitting in the Ransom Center, which led the center to write about them.

One certificate was issued in London in 1776 to Michael Dancer at the end of his apprenticeship. It was big–2 feet by 5 inches–and came with a tube so Mick could roll it up and carry it around with him. The Ransom Center swears that people would have carried these the way we might carry a passport or driver’s license today, to prove identity and citizenship. 

I offer you a grain of salt to go with that explanation. They might well have needed the document for one thing and another–only people who’d been granted freedom of the city could exercise a trade within London’s city limits, and that held true until 1835–but I’d guess it was too important to cart around the streets every day like a driver’s license.

The Ransom Center tells us that along with a freedom of the city certificate, London also presented its new members with “a book titled Rules for the Conduct of Life, which was intended to guide them in their life as freemen. While providing many basic laws and recommended codes of conduct, the book also outlined several interesting freedoms available only to freemen.  For example, the book notes freemen have the right to herd sheep over the London Bridge, go about the city with a drawn sword, and—if convicted of a capital offense—to be hung with a silken rope. Other ascribed privileges are said to include the right to be married in St. Paul’s cathedral, to be buried in the city, and to be drunk and disorderly without fear of arrest.”

I’m not exercised about where I get buried–I hope to be past caring by then–but that silken rope might make freedom of the city worth pursuing. 

 

What does being a freeman of the city get you today?

Not much. Let’s limit ourselves to London: You can’t drive sheep across London Bridge anymore. Capital punishment’s been abolished, so if you want to be hung with a silken rope you’ll have to make your own arrangements. I’m not sure what the law is on drawn swords, but I’d recommend doing some research before you try it. Folks get twitchy about swords these days, no matter what certificate you’re carrying.

That makes the freedom of the city something you can put on your resume, if you have one, but that’s about it. It’s just a bit of English tradition that you’re welcome to take seriously if you can.

The End of Roman Britain: Instability and the Hoxne Hoard

Whatever shortages Britain’s facing due to Brexit and Covid, it hasn’t run short of archeology. The country entered this strange time of ours rich in buried history and since the stuff in question hasn’t gotten up and walked out of the ground, it’s still rich.

The tale I’m about to tell you comes from before Brexit, though, and before Covid. Never mind the logic of that. I needed an opening paragraph. 

 

The tale

Let’s begin in 1992 with a tenant farmer, Peter Whatling, losing his hammer. And since–well, you know how attached you can get to a hammer, he got hold of a friend, Eric Lawes, who’d taken up metal detecting when he retired, and out they went to the field where Whatling had been when his hammer wandered off.

Before either of them had time to get cold and go home for a nice cup of tea, Lawes picked up a strong signal and started to dig, but instead of the hammer he brought up shovelfuls of silver and gold coins. Lawes was an experienced enough detectorist by then to know when to stop digging. He contacted the police and the local archeological society.

The next day, archeologists came and dug out the treasure with the earth still around it so they could move it, intact, to a lab and work out both its age and how it had been stored before it was buried. What Lawes had turned up was 60 pounds of silver and gold in the form of 15,234 (or 14,780; take your pick) Roman coins and what’s technically known as a shitload of fancy thingies of one sort and another.

Lawes got £1.75 million for the find, which he split with Whatling, although legally speaking he didn’t have to. 

Whatling also got his hammer back, and it’s now on display along with the older and more expensive stuff, which is called the Hoxne Hoard, after the village where it was found. And because the English language is insane, that’s pronounced Hoxon. 

Try not to think about it. It won’t help.

The hoard is particularly valuable not just for what it contains but because it was excavated whole instead of being scattered by a plow or an over-eager detectorist. 

Irrelevant photo: Once again, I’m not sure what these are. Let’s just call them some of the many red berries that cheer us through the fall and winter.

 

Why people bury treasure

Every time someone digs up a pile of treasure, someone else asks what it was doing in the ground to start with, and it’s a good question. Who buries these things, and when and why? 

In the case of the Hoxne Hoard, the who is easy to answer (sort of), because some spoons included in that shitload of fancy thingies had a name engraved on them: Aurelius Ursicinus. That can give us the illusion that we’ve answered one of the questions, although we haven’t, really. We know he was male and that he had a Latin name. After that, the record’s blank. We don’t even know for sure that he was alive when the hoard was buried.

As for when, the coins give us something more solid to work with: The newest ones were minted between 407 and 408 C.E. So logically speaking, they’d have been buried sometime after that. 

Why someone buried them, though, draws us into the land of speculation, which is a nice place to visit but it’s always foggy, so it’s hard to be sure of what we’re seeing. What we do know is that some clever devil thought to make a graph of all the dates of the treasure hoards in the British Isles and found spikes in three time periods: when the Roman legions left Britain, when the Normans invaded, and when England divided up into two teams and fought a civil war.

In other words, people bury treasure in troubled times, hoping they’ll be around to dig it back up when the danger’s passed. The ones we know about? Those people didn’t come back. The ones we don’t find and that no one will? Someone came back for those.

 

Roman Britain

I’ve read about the Roman legions leaving Britain and always kind of assumed they got a telegram from Rome: “Troops withdrawn Stop. Expect you home soonest Stop.”

Well of course they used telegrams. They didn’t have email yet. The problem is that you paid for telegrams by the word. Or maybe it was by the letter. Either way, no legionnaire would expect an explanation–it would’ve been too expensive. So off the legions toddled, leaving Britain to fend for itself.

Which goes to show what I know. It turns out that they didn’t all pack up and leave at once. But as we usually do around here, let’s take a step back before we go forward: 

In the mid-fourth century Britain was being raided by an assortment of barbarians–a word I use under protest and only because I don’t have a better one. We attach all sorts of judgments to it, thinking it describes people who are hairy and unwashed and brutal. Also uncivilized, as if civilization was a guarantee of good behavior. But all it means here is that they weren’t Roman. 

Mind you, they might also have been unwashed and hairy and brutal, but except for the unwashed part, so were a lot of Romans. And I’m not convinced that modern well-washed brutality is an improvement, but that’s a whole different issue. 

Let’s go back to late Roman Britain: In the barbarian corner and raiding Britain, we’ve got Picts and Scots (with the Scots coming from Ireland, just to mess with our heads) and Attacots, who I’ve never heard of either. It doesn’t look like anyone knows who they were. Also the Saxons, who we recognize from other storybooks. 

Since the small print of Britain’s contract with Rome specified that Britons couldn’t be armed, the country relied on Roman power to protect it. Or at least the part of Britain that Rome had conquered did. The Romans never did hold the whole thing.

Meanwhile, the more central parts of the Roman Empire had troubles of their own. Barbarian invasions. Uprisings. Emperors. The deaths of emperors. Battles over who was going to be emperor.

In 383, in response to an uproar in the empire that we won’t go into, the Roman army in Britain revolted and named its leader, Magnus Maximus, emperor. He could only be the emperor of the west by then, since the east now had its own emperor, but hey, an emperor’s still an emperor, and the title was worth fighting for. So he–and presumably some sizable chunk of his army–invaded Gaul and killed enough people for him to actually be the emperor. Until he was killed, that is, which disqualified him forever after.

What happened to the soldiers who left Britain with him we don’t know. It seems to be a fair assumption that they didn’t go back, so color the Roman army in Britain depleted.

 

Emperors and clipped coins

After 402, the bulk importation of Roman coins into Britain ended, and from that point on the British started clipping coins–shearing bits off of them and using at least some of the metal to make new coins, which were local imitations of the imperial ones. Since the metal itself was what made coins valuable, this meant the coins were worth less and less.

A good 98% of the Hoxne coins had been clipped, with some of them having lost a third of their weight. If you’re trying to get back into your pre-Christmas wardrobe, you should know that this strategy doesn’t work for humans.

In the midst of all this, we can pretty safely assume that the army wasn’t happy, because soldiers don’t like it when they’re paid in coins that aren’t worth what they used to be. Or when they’re not paid at all. In 406, a rebellion of Roman soldiers in Britain declared someone named Marcus as their emperor. Then he was deposed by someone named Gratian, who was replaced by someone named Constantine, at which point Constantine and his followers toddled off to Gaul–that was in Europe and a far more central piece of the Roman Empire’s jigsaw puzzle–to see if they couldn’t really make him emperor.

He was beheaded and once again there’s no record of what happened to his followers, but it couldn’t have been nice.

And that telegram still hadn’t arrived. That was the problem with telegrams back then. They had to be carried by guys in sandals. On foot. If you paid extra, they’d jump on a horse or they’d set sail, but it was still slow. And precarious.

 

Not-so-Roman Britain

Soon after Constantine and Co. left, in 408 or thereabouts, Saxons invaded, and sometime after that what was left of Britain’s Roman government faced a rebellion. The Britons armed themselves, ran off the barbarians, and then, for good measure, ran off the Roman magistrates and set up their own government. Or so said the historian Zosimus.

It sounds good, but according to the far more modern historian Marc Morris, it was a disaster. Britain’s links with the empire were cut and the archeological record shows a country rapidly moving backward. The economy and social structure collapsed, along with trade and distribution networks. Cities, towns, and villas were abandoned. Morris assumes widespread looting, along with a couple of synonyms–pillaging, robbing, that kind of thing.

Archeologists can’t find much stuff left in the ground from this period. Good-quality pottery disappears, along with things like iron nails. Entire industries, they conclude, failed.

In the absence of a working government and army, the rich would have privatized security for as long as they could–and buried their wealth, because they couldn’t know when their privatized security squad would notice that it didn’t actually need them; all it needed was their hoard of coins and expensive goodies. The person who hired them didn’t actually contribute anything.

Morris assumes that barbarian raids increased, although, as he points out, raiders don’t leave much in the way of hard archeological evidence, so we can’t know for certain.

According to Bede, writing much later, the Britons of this period were “ignorant of the practice of warfare” after so long under Roman rule. Which is why, fatefully, their leaders seem to have made a deal with the Saxons to defend them from the Picts. Emphasis on seem to. History goes a little hazy during this stretch of time. But the going theory is that they swallowed the spider to catch the fly, and that’s how Anglo-Saxon England came to be: The spider did indeed eat the fly by inviting the Anglo-Saxons in, and that left Romano-Celtic Britain with a Saxon spider that wriggled and jiggled and tickled inside ’er.

*

In addition to the two links I’ve tucked in above, I’ve relied heavily on Marc Morris’s The Anglo-Saxons: A History of the Beginnings of England. It’s a highly readable and very useful book. I’ve lost track by now of who recommended Morris to me. Sorry, I have a note somewhere but I put it someplace safe and I’ll never see it again. So I apologize for not thanking you by name. But I really do appreciate the recommendation. Let me know who you are and I’ll include a link in my next post.

The north-south divide in English history

If you’re in the mood to break England into bite-size chunks, look no further than the handy north-south divide. It’s scored so deeply into the body of the country that you can treat the place like one of those candy bars you’re meant to share with a friend.

You want north or south? Choose carefully, because your fortune will rise or fall depending on which you take.

The north-south divide is not only recognized by Lord Google, it’s the organizing thesis of The Shortest History of England, by James Hawes, which I’ll be leaning on heavily here. Focusing a history so heavily on a single thesis damn near guarantees oversimplification, but it also gives the story coherence, which makes for a readable book. If you’re looking for a manageable, memorable history of England, this one works well.

And in fairness to that focus, the north-south divide does tangle itself into England’s history, economics, culture, language, and geography, and it influences Britain’s politics to this day.

Irrelevant photo: St. John’s wort, or rose-of-sharon.

 

What am I talking about? 

The difference between richer southern England and the poorer north, although when we’re talking about southern England, what we really mean is the southeast, which is in turn heavily weighted toward London and the area that surrounds it. 

Where does the country divide? Draw a line along the River Trent, if you can find it, then extend it to the west coast. Next draw a line along the River Tamar to keep Cornwall out of the discussion and another one down the Welsh border to do the same for Wales. The part of Britain on the lower right is southern England. The part at the top is northern England until you get to Scotland, then it’s not England at all. 

I’d have told you to draw a line along the Scottish border, but it moved around over the centuries and I don’t want you starting any wars. 

Let’s trace the divide through a series of colonizers:

The Romans: The Romans held the island’s richest agricultural land, a.k.a. the south. The division may have been a factor before the Roman invasion, but the thing about people without a written language is that they don’t write, so the pre-Roman Britons didn’t leave us much in the way of detailed history. We’ll skip them.

The Anglo-Saxons: In the 8th century, the chronicler Bede, who may be more recognizable if I call him the Venerable Bede, mentions a division between the north Saxons and the south Saxons. I can’t do much more than nod at that, unfortunately, and acknowledge that the division struck him as worth mentioning. The difference could trace back to the island’s geography or to the Romanization of the south or to both. Or it could just seep out of the rocks. 

The Vikings: When the Vikings shifted from raiding to colonizing, the part of England they colonized was the north, both reinforcing the differences and adding layers of cultural and political spice to the sauce. 

The Normans: When Hawes asks why the Normans, with a small fighting force, were able to not just conquer but hold England, one of the reasons he cites is that the English couldn’t mobilize the whole country against them. There was resistance, but it wasn’t the sort of coordinated uprising that might have succeeded. And so the Normans made themselves lords of both northern and southern England, and they kept their own language, Norman French, which not only separated them from the conquered English but at least for a while united the conquerors. 

 

Language

What about the common people–the English? Some small segment of the Anglo-Saxon upper class became Normanized, and the key to that was adopting the French language. Below that level, commoners spoke English, but by the fourteenth century, northern and southern English speakers could barely understand each other. Hawes quotes John of Trevisa on the subject, and we’ll get to the quote in a minute, but first, John of Who? 

John of Trevisa, a contemporary of Chaucer’s and not to be confused with John of Travolta, although Lord Google would be happy to take you down that rabbit hole if you’re interested. The J of T we’re interested in came from Cornwall and was a native speaker of Cornish, but his legacy is a body of scholarly work in English–not in Cornish but more to the point not in Latin and not in French. Choosing English over those last two was a radical act.

Are we ready to go on? Let’s do the quote: “It seemeth a great wonder how English, that is the birth-tongue of English men, and their own language and tongue, is so diverse of sound in this island. . . . All the longage of the Northumbres, and specially at York, ys so sharp, slytting, and frotyng, and vynschape, that we southern men may that longage scarcely understonde.”

Please appreciate that comment, because it hospitalized my spell-check program.

The things I sacrifice for this blog.

Lord Google and I are at a loss over what vynschape means, and we’re not doing any better with frotyng, although for no clear reason I have the illusion that I could understand it if I’d just give it another moment’s thought.

The linguistic divide was still holding in 1490, when a northern merchant was becalmed off the Kent coast, in the south. He went ashore to buy supplies, asking in northern English for meat and eggs, “And the good wife answered that she could speak no French.”

Was the aristocracy as divided as the commoners? By the end of the fourteenth century, court life was shifting from French to English, so the power of French to unite the Normans might–and I’m speculating here–have been on the wane. Either way, heraldry divided the aristocracy into Norroy (the northern realm) and Surroy (the southern one), and the aristocratic families built alliances and power blocs based at least in part on geography.

 

Power

Hawes presents the Wars of the Roses as a particularly bloody outbreak of the north-south divide and sees Elizabeth I as consolidating the south’s rule over the country. One result of this consolidation was that the southern version of English became the dominant one. The first handbook for English-language writers, from 1589, advised writers not to use “the termes of Northern-men . . . nor in effect any speech used beyond the river of Trent.” (George Puttenham, The Arte of English Poesie.)

England’s class structure did allow people to move up the ladder, but to do that they needed to speak southern English. Economic, cultural, and political power all wrapped around each other, and around language and geography. 

Let’s fast forward to James I of England, who was also James VI of Scotland, since after Liz’s death England imported him from Scotland in a desperate effort to keep England Protestant. This meant that, awkwardly, he was ruling two kingdoms, one stacked (at least on a map) on top of the other. He proposed to unite them and make himself the “King of Great Britaine.”

The English elite–for which you can read England’s southern elite–blocked the move. Parliament was by now a force in English politics and inviting Scotland to the party would’ve diluted southern power. 

From there we hit Fast Forward again and stop at the English Civil War, where Hawes sees the geographical divide still at work: The north was resisting rule from the south, and it was ready to make an alliance with the Celts–Cornwall and Wales (I’m leaving Scotland out of the discussion since it pops up on both sides of the war). In this reading, the king and Parliament, along with religious beliefs and demands for equality, weren’t incidental, but they were being driven by underlying forces that generally go unacknowledged.

 

Union

When England and Scotland did finally become one country and Daniel Defoe traveled “the whole island of Great Britain,” he treated northern England and Scotland as more or less the same place. England, for him, was effectively the south. 

For a time, the Industrial Revolution changed the calculations. The south still had the richest agricultural land, but the north had coal, and it now fueled industries of all sorts. The northern elite got rich and northern cities got big. The drive to expand the vote was fueled in part by the northern elite’s push for political power that would match its economic strength.

The north’s power lasted until finance outweighed manufacturing. 

Hawes talks about the country having two middle classes during at least part of the Industrial Revolution, one in the north and one in the south–and it’s worth mentioning here that the British middle class, especially at the time we’re talking about, sits higher up the social ladder than the American one. The southern middle class made its money in finance and commerce and the northern one in manufacturing. The southern middle class belonged to the Church of England and the northern one tended toward dissenting religions–and since that meant their children wouldn’t be accepted by the elite universities, they started their own.

By the 1850s, though, boarding schools for the middle class were opening. They were modeled on the elite boarding schools and their explicit purpose was to educate the sons of the northern elite to become like the sons of the southern. And it worked. Northern boys picked up the southern accent, learned what clothes would mark them as part of the in crowd, and played all the right sports. Basically, money and the fairy dust of southern culture allowed northerners to move upward. Not to the top rungs of the elite, of course–you had to be born into the right families for that–but to the bottom rungs of the upper rungs.

What the hell, upward is upward, and a lot of people were scrambling for those rungs.

Starting in the 1870s, the southern elite’s accent started to be called Received Pronunciation, or RP, and if you had any sort of ambitions, you damn well needed to sound like it was your natural accent. 

 

RP

In the 1920s, the BBC began broadcasting, and if you couldn’t reproduce RP convincingly, you weren’t one of its broadcasters. At roughly the same time, a report on teaching English in England insisted that all children should learn RP–as a foreign language if necessary.

RP was considered standard English and everything else was a dialect. And in case it’s not clear, dialect was bad. If you wanted to move up the ranks in the armed forces, you needed the right accent. If you wanted to be taken seriously in finance, in business, in education, you needed the right accent. Although as Hawes says, the ordinary English didn’t give a damn, they just wanted to sound like Americans. BBC English was no match for Hollywood films. 

 

Disunion

When Ireland became independent, the arithmetic of north-south power shifted. The Conservative Party’s base was southern England, and although it had opposed Irish independence, once Ireland left the party discovered that it was now easier for it to dominate the House of Commons. Reducing the number of MPs had made its southern base more powerful.

And if Scotland leaves the union–which the Conservatives oppose, at least publicly–they’re likely to find that Parliament becomes even easier to dominate–at least if they can hold onto their southern base. 

How a British town becomes a city

The English language plays tricks when it travels from one country to another, so if you asked me to define a city I’d have to ask you where the city is. Or where you are. 

Some days, I’d have to ask you where I am.

In the US, it’s fairly simple: A city’s a place where a lot of people live. How many? Um, yeah, no one’s drawn a clear line to separate it from a town.

In Britain, though, a town has to do more than get big to become a city. And in some cases, it doesn’t even have to get big.

 

The informal definition

Most people in Britain will tell you that a city has to have a cathedral, although one article I read claims a university will do just as well, and a few people think the town has to gather up a lot of people and convince them to live there.

But in Britain there’s a difference between people thinking of a place as a city and the place formally being one. To really be a city, the place needs the queen or king to wave a magic city-making feather over it.

Irrelevant photo: a begonia

Yes, really–except for that business with the magic feather. Because of course the queen or king has the final say over how many cities the country has. If they didn’t, for all we know every cluster of houses would dance around singing, “We’re a city. Look! We’re a city.” Order would break down. Trains would stop running. Long-established recipes would cease to work. 

Imagine Britain without its Bakewell tarts and Victoria sponges.*

So yes, of course officialdom wants to put some limits on the number of cities.

Mind you, the king or queen doesn’t actually make the decisions about which town to citify. Officials do the choosing, but it’s the monarch who waves that feather, presumably while looking entirely serious about it.

Just to confuse the issue, though, any number of towns are governed by bodies that call themselves city councils. 

Why do they do that? Possibly because someone has delusions of grandeur and possibly because the language is at war with the country’s endless formalities. 

 

The formal process

Britain’s home to 66 officially recognized cities–50 in England, 6 in Scotland, 5 in Wales, and 5 in Northern Ireland. Not all of them have cathedrals. The belief that they had to comes from a time when building a cathedral really did make you a city. This led to small places like Truro being cities while much bigger industrial centers like Birmingham and Belfast weren’t.

In 1889, Birmingham became the first cathedral-less place to be recognized as a city, and these days you can leave all that stone in the ground and bid for city status through the Ministry of Housing and a Few Other Things. It’s less romantic than building a cathedral, but it’s cheaper and it’s easier on the fingernails.

There’s a catch, though: You can only apply when the Ministry opens up bidding to mark some special occasion–the millennium, the golden jubilee, the silver jubilee, the arrival of a new kitten. Outside of those special times, towns have to shut up and wait.

What’s a jubilee? In dictionary terms, a celebration of anything from emancipation to becoming a king or queen, but in this context it has to do with Liz having become a queen some number of decades before. Or more accurately, the queen—something Britain as a whole takes seriously, even if not every single individual who lives here does.

 

How big does a city have to be?

Not always very. The U.K.’s smallest city is St David’s, which has a whopping 1,600 residents–not all that many more than the village I live in. It earned its status in 1995 to mark the queen’s 40th anniversary, and it was chosen because of its role in Christian heritage.

Yeah, the monarchy takes that Christian heritage flap seriously. It has to. If it didn’t, what’s to justify someone being the monarch instead of just one more citizen?

Part of the argument in its favor, though, was that it had a cathedral, so people already thought of it as a city. 

In practice, being big doesn’t guarantee official status as a city, and neither does being thought of as a city. London contains two cities–the City of London (called the City, as if the planet didn’t have any others) and the City of Westminster. But London itself isn’t, officially speaking, a city.

If you get dizzy, just sit down and rest a while. We’ll be here when you come back.

 

Mayors and cities

Most city councils (whether they govern cities or towns) will appoint a mayor, who does ceremonial stuff and shows up at special occasions in eye-catching and wildly outdated clothes, including gold chains that outdo anything a celebrity ever turned up in. If the queen (or king, as the case may be) has waved a different magic feather over the locality, the mayor may turn into a lord mayor. This will make no practical difference in his or her ability to climb stairs, lose weight, or push a car out of a snowbank. 

But having a lord mayor doesn’t make a place a city.

Sorry. Like I said, different magic feather, different result.

How do you address a lord mayor? You say, “Lord Mayor.” Or you say, “My Lord Mayor.” Or if appropriate, “Lady Mayoress,” or, “My Lady Mayoress.”

You do not laugh while you’re doing any of that upon pain of being banished from the event and left giggling hysterically on the sidewalk.

In a different category of officialdom, many towns and cities have an elected executive mayor, a title that sounds less impressive but comes with political powers, which ceremonial mayors lack. 

Having an executive mayor also doesn’t make a place into a city. 

 

Can a place stop being a city?

Yup. Rochester accidentally lost its status in 1988, when it reorganized its government structure and–well, you know how sometimes the cat jumps on the keyboard and your entire life disappears and next thing you know you no longer exist? It was like that. 

By way of demonstrating how important it is to have city status, four years rolled past before anyone noticed the city was no longer a city. 

It still hasn’t gotten its status back.

 

What are the benefits of being a city?

None, at least according to Professor John Beckett: “There never have been any privileges. It’s always been a status thing, nothing more. There’s nothing to stop places declaring themselves a city–Dunfermline did it.”

The whole system, he says, “makes no sense” and just “gives a bit of patronage to government”.

Dunfermline declared itself a city in 1856. It figured that since it had been Scotland’s capital for 400 years, it had the right. The idea of it as a city never caught on, though, and it’s planning to bid for genuine city status when the queen’s platinum jubilee rolls around, in 2022.

*

* A Victoria sponge isn’t something you wipe the kitchen counter with. It’s a cake-ish thing, as is a Bakewell tart, although I’m stretching the definition of cake pretty thin in saying that.