Simple thoughts about fair use

Copyright is not an absolute. Potato chips are absolute.

If this is my potato chip, then it's not yours. You can't touch it, eat it or use it for any reason whatsoever, not without asking first. Copyright doesn't work that way.

There is a yin to the yang of copyright protection, and it's called Fair Use. Fair use permits scholars to do their thing, and permits those who would do parody or commentary or comparison to be heard. I'm not talking about taking someone's work to make it into a poster or some sort of endorsement–I'm talking about the need for us to be able to comment on each other's work.

Without fair use, it would be impossible to write a negative book review, or compare Shakespeare to the Simpsons. Without fair use, it becomes just about impossible to have a thoughtful discussion about anything that's been published since you were born.

Most web users should know a few simple guidelines, principles so simple that you can generally assume them to be rules. (Worth noting that whether you are in the right or not, a lawyer on retainer can still hassle you–not fair but true):

  • You don't need to ask someone's permission to include a link to their site.
  • You don't need to ask permission to include a screenshot of a website in a directory, to comment on that site or to parody it.
  • You can quote hundreds of words from a book (for an article or book or on your website) without worrying about it and you certainly don't need a signed release from the original author or publisher. Poems and songs are special exceptions. Then you can worry.

There's a difference between being polite and observing the law. If you quote something (an idea, a notion, a recipe), the right thing to do is give credit.

Photos are a real issue, unless you are clearly commenting on the photo (as opposed to using the photo to make a point that a different photo could make as easily). When in doubt, be the person who took the picture. (Aside: Compfight has an easy-to-use setting–do a search and hit "commercial" in the left-hand column and voila–CC-licensed photos, ready to go.)

PS as soon as you make something and fix it in a tangible form, you own the copyright in it. No requirement that you register it with anyone. Putting a © notice is certainly a helpful way to let people know you consider it yours, but the law makes it clear that merely writing your creation down confers copyright to you. And… "all rights reserved" doesn't mean anything any more, just fyi.

PPS Here's what happens when the lawyers go (way) too far.

I was wrong

In 1993, I saw the web coming. I was hired to write the cover story for a now defunct computer magazine about the internet, and dismissed the new Mosaic browser in a single paragraph.

I figured the web was just like Prodigy, but slower, harder to use and without a business model.

About as expensive a wrong analysis as a single entrepreneur with an email company could make in 1993.

The reason it was an insanely valuable lesson: I got better at announcing that I was wrong, learning from it and doing the next thing.

Politicians, of course, are terrible at this. They are never wrong, apparently, and when they are, they spin instead of admitting it. Which not only hurts their trustworthiness, it prevents them from learning anything.

Two elements of successful leadership: a willingness to be wrong and an eagerness to admit it.

Joel and Clay don’t write often enough

Here's Joel Spolsky's latest post and project. Worth a read if you think about software, business models or how people use Excel.

And Clay Shirky's latest is about the future of newspapers. Again and again, he's right.

Any day when you can learn something new from either of these guys is a good day. Hard to imagine that just six years ago, knowledge like this was carefully hidden or non-existent.

Walking away from “real”

As in, "that's not a real football team, they don't play in Division 1" or "That stock isn't traded on a real exchange" or "Your degree isn't from a real school."

Real contains all sorts of normative assumptions and implicit criticisms for those who don't qualify. Real is just one way to reject the weird.

My problem with the search for the badge of real is that it trades your goals and your happiness for someone else's.

“How much are you going to tip?”

Have you ever been asked that question when splitting the check?

There are two couples at the table, the waiter has brought separate checks and the credit card holder turns to the other credit card holder and tries to find out how to coordinate the tip.

Why?

I mean, if they were out just the two of them, would they ask what people at the other table were going to tip? (Probably, hence the need to invent the 15% standard). Why does it matter if one couple tips 14% and the other 18%?

Don't underestimate just how badly many people want to fit in. (Not with everyone, just with their tribe and their peers).

What does it mean to be popular?

Some things (and people) are popular because a chord is struck, because there's a right place/right time alignment of interest and solution. But more often, an idea is popular because something had to be. Tribes demand winners, the flavor of the month, the safe choice.

That's why being the "presumed front-runner" is so vitally important. People want to hire the person or vote for the person or work with the organization that most people in their circle would have picked. They are then blameless.

We say we want to root for the underdog, but actually, we want to be seen as rooting for whomever everyone else is rooting for.

A significant part of marketing to strangers is the work of appearing to be the dominant choice, the safe choice, the one that's going to get picked by everyone else soon. Get in sync, the thinking goes, if you're the kind of person who wants to be in sync.

One example: a restaurant that highlights its most popular dishes virtually guarantees that those dishes will become the most popular.

There is a huge range of tools and signals available to marketers willing to invest in the position of most popular. But the signals are expensive. Because the presumed front-runner can afford it. Hence the circular nature of marketing investment–acting like you can afford it often means you will soon be able to.

And the best plan for the insurgent brand? To find a smaller tribe, become the presumed winner there, and scale it up across tribes.

Direct response and the coarsening of culture

Direct response advertising to strangers is demanding. You pay for your click or you pay for your stamp and then you get a shot at making a sale. No sale, no revenue; no revenue, no more stamps.

As a result, direct marketers sometimes race to the bottom. They sell what sells the first time, and use the words that work right now. If the largest conversion rate is for a flat belly diet, then it's the flat belly diet that gets sold. The public gets what it wants.

And what does the mass public want? Shortcuts. Discounts. Claims. No room for subtlety or even innovation.

Yes, there are great products sold by direct marketing, but in most cases, those products were dreamed up and refined and beloved in a less measurable world.

In a world that was 90% retailers and PR and word of mouth, the direct response around the edges was no big deal. It brought us the Veg-o-matic and bald spot hairspray, but it didn't really direct the culture.

Here's the thing: going forward, just about all the growth in marketing spend is happening on the direct response side. Google ads, email campaigns–these are measured in percentage points and in clicks. Without the tastemaking sensibilities of the buyer at Bloomingdale's or the quality guys at Fisher Price, the urge to compromise/shorten/cheapen/overpromise/dumb down is almost overwhelming.

It's already happening to TV and music. (The label doesn't have to please the music-loving program director. It has to please the YouTube clicking teen.) It's likely to happen to your industry soon as well.

People who have never sold advertising sometimes point out that a new form of advertising is better because it's more measurable, because it provides exact data instead of clumsy diary systems. Do you see that most advertisers don't actually want better data? If you're not sure what's working, you can't get blamed. And since you can't get blamed, you get to decide, to be creative, to create stories and fables, instead of merely being Mr. Ronco selling the bassomatic, at the mercy of anyone with a telephone.

Measurable isn't always the only thing that matters.

Trading in your pain

The pain of a lousy boss, of careless mistakes, of insufficient credit. The pain of instability, of bullying, of inadequate tools. The pain of poor cash flow, corrosive feedback and work that isn't worthy of you.

Pain is part of work. And it leads to two mistakes.

The notion that you can trade your way out of pain.

"If I just get a little bigger, a little more famous, a little richer–then the pain will go away."

This notion creates a cycle of dissatisfaction, an unwillingness to stick it out. There's always a pain-free gig right around the corner, so screw this, let's go try that.

The truth is that pain is everywhere, in every project and in every relationship and in every job. Wandering from one to another merely wastes your energy.

The other choice, though, is:

Embracing your current pain and avoiding newer, unknown pains.

This is precisely the opposite mistake. This leads to paralysis. Falling in love with the pain you've got as a way of avoiding unknown future pains gets you stuck, wasting your potential.

As usual, when confronted with two obvious choices, it's the third choice that pays.

On creating a hassle

To quote Merlin Mann, "You don't let the guy with the broom control how many elephants are in the parade."

Harsh to say, but the fact is that great storytellers and artists and ruckus makers manage to insulate themselves from the people they're going to hassle. And the job of those who are being hassled by the commotion is to be hassled by the commotion. No commotion, no job.

The artificiality of time

Until the transcontinental railroad, there were no time zones. Each village kept its own time, based on its own steeple and its own high noon. And why not? There was no good reason to go through the pain of coordinating the clocks.

Factory work forced us all to know exactly what time it was. The shift couldn't start until the foreman and the workers were ready to go. Synchronicity paid big dividends, so we embraced it.

This notion of lockstep started to inform all elements of our culture. Not just what time rush hour was (what a bizarre concept) but how old you should be to go to college and to get a job and to get married and to have kids and to retire.

The web is asynchronous. Time frames have accelerated (started/funded/built/sold!) at the same time they have slowed down. It's up to you to decide how long your time horizon is–perhaps you're willing to invest five years into building a solid reputation on a web platform. The decision to work at a different rate than others can be a significant competitive advantage.

Celebrate New Year's when you want to, and as often as you choose. They're your resolutions, not ours.
