Another Libertarian Argument Against Patents Bites the Dust
Libertarians and Austrians, including such organizations as the Cato Institute, the Ludwig von Mises Institute, and the Wall Street Journal, have put forth a number of arguments against patents and intellectual property. These arguments include that ideas (an invention is not just an idea, but I will let that go) are not scarce and therefore patents are not real property rights, that patents are monopolies, that patents inhibit the growth of technology, that patents require the use of force to enforce one's rights, and that patents are not natural rights and were not recognized as such by Locke and the founders, among other arguments. I have discussed most of these arguments earlier and will put the links in below. One of their favorite fallback arguments is that patents limit what I can do with my property. For instance, a patent for an airplane (Wright brothers) keeps me from using my own wood, mechanical linkages, engine, cloth, etc. and building an airplane with ailerons (and wing warping). This, according to the libertarian argument, is obviously absurd. After all, it is my property.
Moreover - and to the point - the REAL invention of the Wright Brothers (which even they did not appreciate) was the PROPELLER. Everyone had always been misled by ships because of Ericsson's screw propeller. But water is so much denser than air that no one perceived the propeller as a WING until the Wrights figured it out. Now THAT was a true invention, truly deserving of intellectual property rights.
Now consider another product of the individual, mental rather than physical. If people cannot own their ideas, or the goods and services they produce from them, so that the financial reward accrues to the creator, we are back to a hunter/gatherer, finders/keepers mentality, pre-property rights, and people will end up squabbling over an asset like chickens in a barnyard fighting over a worm.
The Constitution set up protections for inventors to guarantee them first rights to income from their creation. It's why a farmer is entitled to the harvest from the seeds he has planted and the plants he has tended (at least before Monsanto). Early tribal societies would give the prime portion from a kill to the hunter, who would share the animal with the rest of the tribe, an early case of the division of labor.
I don't have patents on my products, but have 35 years' worth of trademarks and copyrights. My products use mostly geometry, which is public domain, and even my trademarks acknowledge that a name applies to a certain product in a limited category and does not take the word out of the language universally and exclusively. For example, I own the trademark "Blockout", but not the word "Block" nor the word "out". And only in the classification to which it belongs. If you want to make a shoe and call it Blockout, have at it.
It seems that those who are first with a new idea own it. Like fungus, that singularity begets an entire population of related ideas, each building on what came before and adding some new element to it. Squabbles over ownership don't really arise unless profits are involved. Then everyone wants a piece of the action.
The most significant viral spread of ideas has been in software, open source, the Internet: give it away free to get everyone addicted, and then sell them ever-changing upgrades. How else would, in a mere 20 years, an entire planet become wired, with the majority of people walking around with a cellphone permanently attached to their skull and tablets in their bags?
How would online communities, involving billions of people, have arisen, like the old village square where folks would sit around and converse and decide the weighty questions of the day or play games? Millions of online games hook nearly everyone into global projects. What a fertile medium for the dissemination of ideas that people are willing to share not for profit but for the sake of spreading those memes.
TED talks started out with $4500 tickets for people to attend the live events, until they decided to put them online free for the world to see. "Ideas worth spreading", they call it. The payoff is not in money but in minds. Those are intangible goods. But when it comes to physical goods that inventors or their investors produce, then property rights do apply. The patent office examiners have the unenviable task of arbitrating what is an original idea and what is just derivative. I suspect that favoritism can creep in for ruling in favor of one but not another. Having a decent patent attorney helps.
I find the best protection for your intellectual property is to get it embodied and famous as soon as possible so imitators can't grab it, too. They'll jump in with spin-offs soon enough. Not that that stops agents like the Chinese from treating it as finders/keepers. I'm having to defend my domain names regularly against unauthorized borrowers.
I understand that 100,000 new websites are added to the Web every day. How's that for viral propagation? Lots of people out there are busy with new or adapted ideas, to the point where entropy sets in and only the search engines can keep trying to track them all. And then someone up on the meta-level will have a breakthrough idea for a better way to organize it, or monetize it. I wonder how many more lone entrepreneurs, the singularities like Gates and Jobs and Wales, will be coming along and succeeding, even with a barnyard full of chickens pecking at their feet.
I am curious. Which topic do you find more interesting: patents as personal property, or the quality of our exchanges here?
And there is no question that it takes discipline to curb one's enthusiasm about a new idea that can absolutely lead to code that isn't documented or that doesn't adhere to good design fundamentals. You make an excellent point there.
As for the transceiver example, it is hard to guess at how someone else makes value judgements without being there and considering the same inputs and outputs. I can either assume that it was a rational decision based on perceived profit margins, licensing costs, etc., or I can also just chalk it up to hubris -> irrational decision-making. ;)
ICs rarely do exactly what you want them to do either, but it doesn't make sense to build a custom ASIC for every application. I understand that it is often difficult to reuse code that does something similar, but I see that as the problem. I also do not see how it is rewarding to build, for example, a transceiver if one is on the market that meets your requirements, when you are in the cell phone business.
I also have a different take on your "wasteful" observation: most programmers I know enjoy the challenge of figuring something out for themselves. They generally eschew merely copying what someone else does because it isn't as personally satisfying. The other part that I have observed is that though problems may be similar, they are rarely exactly the same, which means that for any given situation, a minor adjustment is needed. It is really difficult to do that without understanding what you are tweaking, and often, writing your own both solves the problem AND customizes it to your specific needs.
As to your observation on code sharing, social media and forum sites have a cost and a purpose. Most people who use them get their payoff in recognition rather than in monetary remuneration, so it isn't as if they aren't receiving something of value for their contribution. I would add "without compensation" to your sentence "It also created a culture that using other people's work is acceptable" to complete it. It doesn't become theft until you refuse to "pay" for what you are getting. Value doesn't always have to be in money alone.
Your last paragraph doesn't describe any of the programmers I've ever worked with. Programmers are by their very nature some of the most logical people I've met. The place where things break down and you get the infamous "spaghetti logic" isn't usually due to the programmers, but because of the business rules they are attempting to mimic in code. Things become a mess because those business rules are rarely well codified, diagrammed, and turned into procedures! When business managers are trained to think decisions through before jumping to conclusions, we will see better code as a result. Until then, we will continue to be frustrated by the disorder and haphazardness.
Don't get me wrong - I'm a big one for standards. I'm not advocating for disparate systems and inoperability - especially in communications protocols. That being said, however, I think it is way too simplistic to think that every problem can be solved with the same approach in the most effective, efficient manner. While the perfectionist in me still cringes at the idea of a solution that isn't 100% perfect, the realist in me is smart enough to reason with the business manager to say "for what you are willing to spend in resources, you will get a job that is 85% of what you want."
I do not contribute very much either. But I find far too many participants here making logical mistakes and ad hominem attacks. We all make mistakes. I call it a good day when I make fewer than a hundred. But people making personal attacks must be reminded that they are not allowed in polite society, even in vigorous debate.
Thank you, again for doing the reminding.
All the best!
Go away.
By poor vocabulary I mean that two high-level applications in the same space may use completely different words to describe what they are doing. This makes it very difficult to compare the two. From a patent point of view, it makes examining these applications very difficult. Compare this to the electronics industry and you will see a more common vocabulary. The closer the software is to the hardware, the more consistent the vocabulary, in my experience.
I say wasteful, because I have seen and heard programmers say that they will reprogram something they need (sorry about the lack of specifics, but they all related to clients' inventions) rather than buy it. Now I know there are some practical reasons for this - failure to build software as logical structures, which was supposed to be the point of C - but this created a culture of repeating each other's work. It also created a culture that using other people's work is acceptable.
Also, many in the programming business tend to write code first and then figure out where they are going. When you ask them to describe their code, it is a bunch of overlapping circles with no structure, which I am sure makes it a nightmare to debug. This may have been great for job security, but it means that each piece of software stands alone. Imagine if the seven-layer OSI model had not been adopted and everyone combined layer seven apps with layer one or two functions. Then everyone has to program everything and there is no interoperability.
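The layering point above can be sketched in a few lines. This is a hypothetical toy (the class names and the in-memory "wire" are my own invention, not any real protocol stack): each layer talks only to the interface of the layer directly beneath it, so the application at the top never touches bytes or wires, which is exactly what makes independently written stacks interoperate.

```python
# Toy sketch of protocol layering (hypothetical names, not a real stack).

class PhysicalLayer:
    """Layer 1: just moves raw bytes (here, a shared in-memory buffer)."""
    def __init__(self):
        self.wire = b""
    def send(self, data: bytes):
        self.wire = data
    def receive(self) -> bytes:
        return self.wire

class TransportLayer:
    """Layer 4: encodes messages; knows nothing about the application."""
    def __init__(self, physical: PhysicalLayer):
        self.physical = physical
    def send(self, message: str):
        self.physical.send(message.encode("utf-8"))
    def receive(self) -> str:
        return self.physical.receive().decode("utf-8")

class ChatApp:
    """Layer 7: application logic; knows nothing about bytes or wires."""
    def __init__(self, transport: TransportLayer):
        self.transport = transport
    def say(self, text: str):
        self.transport.send(text)
    def listen(self) -> str:
        return self.transport.receive()

# Two separately built apps interoperate because they only depend on
# the layer interfaces, not on each other's internals.
wire = PhysicalLayer()
alice = ChatApp(TransportLayer(wire))
bob = ChatApp(TransportLayer(wire))
alice.say("hello")
print(bob.listen())  # -> hello
```

Collapse the layers (the "layer seven app doing layer one work" case) and `ChatApp` would have to know about byte encodings and the wire, and no other stack could talk to it.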
Inventors have a decision to make. If the invention is not detectable (difficult to reverse engineer) from the product, once it reaches the market, why would any sane inventor publicize the invention and alert competitors?
If there is no government protection offered, why would anyone spend their precious lifetime inventing, if the next day a competitor can copy with impunity? For the public good? Hello, moochers.
Personally, I think that patents should last quite a bit longer. Why is copyright so much longer than a patent? I do not know.
Another example: RISC vs Intel's x86 instruction sets. Toss in Motorola (old Apple devices). Now you even have ARM. This is the foundational microcode that everything else rides upon, yet each solves the basic hardware control functions differently and with different strengths and weaknesses. Are you really trying to argue that we never should have tried anything other than x86?
What about the competition between Token Ring, IPX/SPX, and TCP/IP? I for one was very glad I never had to run vampire taps, though I still believe IPX/SPX is a better layer 3/4 protocol than TCP/IP...
I would also point out that writing microcode is different than writing assembly, which is different than writing Cobol, which is different than writing SQL. Each is a different abstraction layer, further and further removing hardware controls from anything the programmer even has to worry about in the first place, so I have to cringe at your argument that "All software is a way of wiring an electronic circuit" as a gross oversimplification.
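To make the abstraction-layer point concrete, here is the same question ("what is the total of the even values?") answered at three levels in one short sketch. This is my own illustrative example, not anything from the thread; it uses only the Python standard library, with SQL standing in for the highest level where iteration disappears entirely.

```python
# One question, three abstraction levels (illustrative example).
import sqlite3

values = [1, 2, 3, 4, 5, 6]

# Low level: explicit control flow, closest to how the hardware steps through it.
total_loop = 0
for v in values:
    if v % 2 == 0:
        total_loop += v

# Higher level: declare the transformation; the runtime manages the loop.
total_expr = sum(v for v in values if v % 2 == 0)

# Highest level: declare only the result you want; SQL hides iteration entirely.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (v INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(v,) for v in values])
(total_sql,) = conn.execute("SELECT SUM(v) FROM t WHERE v % 2 = 0").fetchone()

print(total_loop, total_expr, total_sql)  # -> 12 12 12
```

Each level produces the same answer while hiding more of the machine, which is why "all software is wiring a circuit" flattens a real and useful hierarchy.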
The whole reason to use software is to be able to adapt to new circumstances. Yes, you can completely build a device in hardware. It will be very, very good at doing exactly what it was designed for. And only that. You can't fix bugs. You can't upgrade. You can't add functionality, because without software there is nothing to reprogram.
And if by a "poorly developed vocabulary" you're referring to the plethora of programming languages, again, you miss the point that each one was created with a specific specialty in mind. Fortran and Cobol were for science and business - lots of mathematics - when memory was limited. Basic was great at branching and conditional logic. Java focused on portability. C++ took object orientation to a whole new level. To say that any one of these is confusing, overlapping, etc. is to miss the entire point that they were tools to serve different purposes. (Although if you want to look at Lisp as confusing, I'll agree 100%. Talk about "bracketology" ...)
"Three, many computer geeks have no idea what software does."
This geek does. Software is what enables me to do whatever I need to do. Software allows me to adapt to the changing needs of my business with minimal cost and materials. Software allows me in a few minutes to do what it would take a design team years of effort to achieve via hardware. Software equals convenience which in turn equals productivity and adaptability, which equal profitability.
To ponder: Can you imagine trying to build a hardware-based database? If ever there were a better example of a need for software, I'm drawing a blank...
The problem began with computers. Computers are customizable tools of the utmost utility. Once they were brought onto the scene, patent law went haywire because of the literal virtualization of intellectual property. One no longer had to have a physical product to demonstrate ingenuity.
Thus the difficulty: when what you are building is a tool to fulfill a specific purpose but the tool is ephemeral, does it really constitute a "novel" and "real" invention for the purpose of a patent? I don't have an answer. It's just the crux of the whole situation.