“Authentication by ‘Cognitive Footprint’”

This entry could almost have come from Geek Drivel:

[…] I remember reading a science fiction story about a computer worm that searched for people this way: going from computer to computer, trying to identify a specific individual.

I immediately thought of The Adolescence of P-1, one of my teenage favorites, and I was tickled to see that someone else had the same thought in the comments. 🙂

“MPAA Directly & Publicly Threatens Politicians Who Aren’t Corrupt Enough To Stay Bought”

Remember the SOPA drama last week? It has given MPAA not-quite-lobbyist Chris Dodd a bad case of foot-in-mouth: he publicly threatened politicians who’d taken MPAA money for not doing what the MPAA wanted. On national television, no less.

Un-freakin’-believable. And just this side of actually criminal. Dodd is a former senator — he should know that the money-for-votes relationship is illegal, and must remain an unspoken truth. Now he’s confirmed what everyone knew was the case, but had no evidence for: that SOPA/PIPA were written by the entertainment industry, and that that industry considers its donations to be bribes and expects the laws they want to be passed because of them.

(Hint, Mr. Dodd: the polite fiction is that campaign donations are no-strings-attached gifts, presumably because the politician in question has views compatible with the gift-giver and the gift-giver wants to help ensure that the politician is elected. Crossing that line is illegal. I shouldn’t have to be telling you this.)

If the MPAA doesn’t fire Dodd over that comment, they’ll prove they’re just as stupid and out-of-touch as he has just shown himself to be.

EDIT, 2012-01-25: Hm, maybe this was actually criminal. Some people think so, anyway. I rather hope that this probe happens, though realistically I expect Obama and other politicians won’t dare to upset the MPAA any further by investigating them… unless, of course, enough people demand it. I also expect that, if such an investigation is started, it will be a sham, and everyone will know it.

“Ten 100-year predictions that came true”

Considering the poor accuracy of professional science fiction authors even in “near-future” SF, this guy’s track record is nothing short of amazing. Too bad he can’t be around to enjoy his success, but when you’re making predictions for a century hence, that’s a bit problematic. 🙂

SOPA and PIPA stopped — for now

Wow.

I didn’t expect anything like what happened, and apparently neither did anyone else — including the MPAA. From the e-mail I received from FightForTheFuture.org:

The MPAA (the lobby for big movie studios which created these terrible bills) was shocked and seemingly humbled. “‘This was a whole new different game all of a sudden,’ MPAA Chairman and former Senator Chris Dodd told the New York Times. ‘[PIPA and SOPA were] considered by many to be a slam dunk…. This is altogether a new effect,’ Mr. Dodd said, comparing the online movement to the Arab Spring. He could not remember seeing ‘an effort that was moving with this degree of support change this dramatically’ in the last four decades, he added.”

Mr. Dodd is also claiming that we who oppose SOPA and PIPA are lying about it:

For the more traditional media industry, the moment was menacing. Supporters of the legislation accused the Web companies of willfully lying about the legislation’s flaws, stirring fear to protect ill-gotten profits from illegal Web sites.

Mr. Dodd said Internet companies might well change Washington, but not necessarily for the better with their ability to spread their message globally, without regulation or fact-checking.

“It’s a new day,” he added. “Brace yourselves.”

Citing two longtime liberal champions of the First Amendment, Senator Patrick Leahy and Representative John Conyers Jr. of Michigan, Mr. Dodd fumed, “No one can seriously believe Pat Leahy and John Conyers can be backing legislation to block free speech or break the Internet.”

Oh? I can seriously believe that they didn’t read the bill themselves, and don’t know anything about the infrastructure of the ‘net. Many senators and congresspeople don’t, probably most of them — and some are even proud of that ignorance.

The article goes on to say…

Mr. Smith, the House Republican author, said opposition Web sites were spreading “fear rather than fact.”

“When the opposition is based upon misinformation, I have confidence in the facts and confidence that the facts will ultimately prevail,” Mr. Smith said.

The facts, Mr. Smith? The MPAA and RIAA are their own worst enemies in this fight. The facts are that they’ve proven they’ll abuse any power they can get over the Internet. One of the best-articulated descriptions of the problem that I’ve seen is from user “wootah” on the Scott Adams blog. Unfortunately that blog doesn’t seem to offer permalinks to comments, but you can find this comment on this article if you dig deeply enough (it’s at the top of comment page three of four at the time of this writing). I haven’t followed all the links, but the cases he cites are common knowledge to those of us who’ve been watching. Please forgive the writer’s various abuses of spelling:

Jan 18, 2012
Scott, You are looking at it wrong.

Lets look at it entirely from a new view. When a dictatorship is caught abusing their legitimate powers, they will often make new laws to allow them to continue their behavior under the paradigm of legitimacy. Of course we know that this doesn’t stop them from coming up with new and creative ways to abuse their new powers. Ideally though, in this land of the just and free, a lobbying group would want the law to be so vague that just about any future abuse whether planned for or simply accidental will still fall within the law.

If you were opposed to the abuses of the original law, then you are going to be opposed to the new ‘privileges’ granted under the new law. So all we have to do is find examples of abuse under the current law and then compare them to the vagueness of the new law to get an idea of what we are up against.

Here is an example of the Existing DMCA law being abused by filing automated takedowns. They claimed that it was impossible to examine everything individually so they were just going to brute force screw everybody in a blanket automated takedowns: http://torrentfreak.com/warner-bros-admits-sending-hotfile-false-takedown-requests-111109/

Here is one where a Music blog was seized FOR YEAR. Because it was allegedly infringing… only to be later returned because nothing was illegal on that blog, and all music hosted was given WITH permission from all artists. No wrong doing was admitted: http://arstechnica.com/tech-policy/news/2011/12/ice-admits-months-long-seizure-of-music-blog-was-a-mistake.ars

The Music industry has ridiculously attached absurd numbers and penalties in an effort to create a new business model since theirs is failing. When they sued limewire, they asked for 75 Trillion (with a T) dollars in penalties… Apparently the infringers stole more value in music than there was money in all the markets of the entire planet. http://www.pcworld.com/article/223431/riaa_thinks_limewire_owes_75_trillion_in_damages.html

An example of the RIAA trying to close down Penn State Physics Department Computer infrustructure because of a similarly named song… Openly admitting they sent the takedown without bothering to listen to what they thought was infringing matierial: http://news.cnet.com/2100-1025_3-1001095.html

The people harboring the law want a double standard. When they were found atually pirating music that they didn’t own (by putting together ‘hits mixes’ and selling them without compensating the author,) they used their extensive group of lawyers to protect themselves from the type of fines they want to levy on other people.
Summary, Infringement based on their own numbers: 6 Billion
http://www.pirateparty.org.uk/blog/2009/dec/7/music-industry-faces-6-billion-copyright-infringe/
Result, Settling for 45 Million.
http://www.michaelgeist.ca/content/view/5563/125/

Conclusion:
Does the music industry need any more power to go after people? No they don’t. They want to put you in Jail for FIVE years for infringing. And their definition is so broad that me posting a 50 second video containing a small portion of another video distributed legally (and therefore already compensated once) would risk both you and me going to jail for 5 years AND the shut down of your sight.

Here you go Scott, Isn’t this little guy cute:
http://www.youtube.com/watch?v=kU9MuM4lP18

Note Due to posting this on the blackout day, some of the links will redirect to blackout. But they should work tomorr

I’ve heard of many more abuses by the RIAA and MPAA over the last ten years. Congressman Smith, after reading that, do you really think that Internet users would change their minds if they knew “the facts”? I suspect they’d form lynch mobs instead, with you as one of their targets.

Believe it or not, I sympathize with the MPAA’s and RIAA’s position. As the owner of a small software company, I see people steal my work all the time, and it drives me nuts. But I can’t support any law that gives a handful of corporations the power to silence any site on the Internet essentially at will, and has such a chilling legal effect on any site that allows user comments. And that’s not even counting the consequences for the future… if they had succeeded in putting the infrastructure of censorship in place, how long do you think it would be before it was hijacked for other uses as well? People with power do everything they can to retain and increase that power, and a setup like that would be irresistible to them.

SOPA? Just stop-a.

“Footie club sacks striker for homophobic tweet”

The offending tweet:

I wouldn’t fancy the bed next to Gareth Thomas #padlockmyarse

Does this mean that the tweeter would climb into the bed of any female sleeping in the same room and have his way with her, even if she didn’t want it? Because that’s what it sounds like he’s saying. If not, then why would he think a gay guy would do something like that?

Granted, he’s probably avoided anyone with “teh gay” like the plague, as many sports-oriented “manly men” do, so he’s never really known one. Not knowing such a person doesn’t excuse snide and belittling comments about anyone, though.

Stop SOPA/PIPA!

As you may have noticed, this site — along with thousands of others — was blacked out today, to protest the SOPA and PIPA bills.

If you’re an American Internet user and haven’t heard of them before, what rock have you been hiding under? Go find out what it’s all about (this page might help), and when you’re suitably horrified, contact your congresscritters and give them a piece of your mind. Many of them apparently need one.

Linux compiles and massive IOWait delays

As mentioned previously, I use a 13″ mid-2009 MacBook Pro as my development machine, with virtual Linux and Windows machines running under Parallels. All was mostly well, except that I’m doing a lot more compiling in the last few months than I had been previously, and the IOWait problem on the Linux VM — always an irritant — had become ever more painful.

How painful? The first compile of a small C++ source file took roughly three minutes and forty-eight seconds, almost all of it spent waiting for the hard drive. Subsequent ones (if I’d logged in no more than a few hours ago) took only sixteen seconds. If I did anything between compiles — switched to a Firefox window to do some research, for instance — then the time for the next compile started climbing toward the initial mark pretty quickly. And if I left it logged in overnight, a compile of the same file never took less than a full minute, until I logged out and back in again.

I don’t know why the hard drive on this system seems so slow. I can’t even figure out if what I’m seeing is normal for this machine’s specifications (and I really hope it isn’t). If the hard drive were noticeably faster, the IOWait bottleneck wouldn’t be as much of a problem, but replacing it to find out isn’t a viable option at the moment, so it was time to look for alternative ways to improve it.

The first thing you always look at in such cases is giving the machine more memory. That was problematic here though: I need to run at least two virtual machines (Linux and Windows) almost full-time, often with a third (an older version of Windows) as well. The third virtual machine can get away with only a gigabyte of memory, but the Linux system requires at least two gigabytes with the workload I use it for, and the other Windows one nearly as much. The host machine maxes out at 8GB, and it’s fully loaded. Gritting my teeth, I decided to sacrifice the third VM and bumped the Linux VM up to 2.5GB. There was no noticeable change.

(I’d already ensured that both the host machine and the Linux VM were running fully in memory, without swapping.)
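On the Linux side, a quick way to verify that is to look at the swap counters the kernel exposes in /proc/meminfo (a sketch; the equivalent check on the Mac host would be vm_stat, not shown here):

```shell
# Show total and free swap on the Linux guest (values in kB);
# if SwapTotal equals SwapFree, nothing has been swapped out.
grep -E '^Swap(Total|Free):' /proc/meminfo
```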

The next avenue to explore was figuring out what the compiler was spending its time reading. If I knew that, maybe I could do something to streamline it. But all attempts at identifying that have failed — GCC’s bewildering array of debugging options doesn’t seem to include anything that provides that information, and though I’m sure there are programs to log exactly what files are being accessed when under Linux, I haven’t been able to locate them.
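(It turns out strace can do exactly this, tracing every file a compiler process opens; I just didn’t know of it at the time. A sketch, assuming strace is installed — the sample log line below is fabricated for illustration:)

```shell
# strace would log every openat() call made by g++ and its children:
#   strace -f -e trace=openat -o compile.log g++ -c myfile.cpp
# Each line of the log looks roughly like this (fabricated example):
echo '12345 openat(AT_FDCWD, "/usr/include/c++/4.6/iostream", O_RDONLY) = 3' > compile.log
# Extract the unique file paths that were opened:
grep -o '"[^"]*"' compile.log | tr -d '"' | sort -u
```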

Okay, plan C: maybe there was some error or setting on the disk that was slowing down the reads? fsck gave the Linux virtual drives a clean bill of health, and using noatime made no noticeable difference either. Disk Utility on the host machine claimed there were many permissions problems, so I let it grind away at it for more than an hour until it was satisfied, but that produced no change. Scanning the host disk for any problem areas took a while longer, and was equally fruitless. I even tried resetting the PRAM, though I’m not sure what that is or does; no effect.

Plan D involved digging into all the information Google could provide on Linux file system speeds. Maybe an alternate file system would help? From everything I was able to find, the only one that might help significantly was ReiserFS, and only if the files the compiler was spending its time on were small ones. Experimenting with that felt like it would waste more time than I’d save by solving the problem (assuming it did, which wasn’t assured), so scratch that idea.

On to plan E (and some concern that I’d run out of letters before this was through): maybe there’s a cache setting to improve things?

Paydirt! Or rather, something slightly better than just dirt. 😉 After maybe a couple days of work on that, spread out over several weeks, I finally found one page a couple days ago that described the only option that made a significant difference: the cache-pressure setting. Essentially it tells the caching system whether to prefer to keep the contents of files in the cache, or the file-system information that lets it find files. The default setting is 100, which means keep both equally; a higher setting favors the contents of files, a lower one favors the file-system info. Some experimentation with it (using the command sudo sysctl -w vm.vfs_cache_pressure=XX, where XX is the number to set it to) showed that a setting of 10 kept the compile times and IOWait to a minimum — success!
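(Note that a sysctl -w setting doesn’t survive a reboot; to make it stick it also needs a line in /etc/sysctl.conf. A sketch — reading the value back works on any Linux box, while the writes require root:)

```shell
# Read the current value (the kernel default is 100):
cat /proc/sys/vm/vfs_cache_pressure

# Set it for this boot only (needs root):
#   sudo sysctl -w vm.vfs_cache_pressure=10

# Make it permanent by adding this line to /etc/sysctl.conf:
#   vm.vfs_cache_pressure = 10
```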

Or was it? That worked well if I’d logged into the machine within the last few hours, but after it had been running for a while, compiles started slowing down again — to the point that, after leaving it running overnight, that file took more than a minute to compile, no matter how many times I tried it or how close together they were. Better than it was previously, but was there any way to improve it further?

What was happening overnight that could affect it that way? What did logging out and back in change that fixed it? The only answer seemed to be memory, again — as the VM ran longer, the memory in use grew, until it stabilised at between 600MB and 700MB (closer to 1GB if Firefox, with my current set of must-have extensions, were also running). That left a gigabyte for caching — surely that was sufficient for whatever GCC needed to look at? But there was no other difference I could find.

Maybe 2.5GB just wasn’t enough? I couldn’t imagine why that might be the case, but I bumped it up to a full 3GB.

It worked. The first compile after rebooting the machine still took the same amount of time, and subsequent compiles remained at about 16 seconds — but the next morning, after leaving it running overnight, compiles after the first one stayed at about 16 seconds. The IOWait was over! 😉

I’m not real happy about that solution. The machine is responsive, but it’s operating perilously close to its memory limit: there isn’t enough room left to sneeze in without forcing it to start swapping to disk. When it was running three VMs, I could always shut down the third one if I needed to free up some RAM; now that safety valve is gone. Even upgrading to more recent hardware wouldn’t help; the current crop of MacBook Pro machines also maxes out at 8GB.

I would really like to stay with a Mac, for the convenience of having all three major OSes available simultaneously. I must stay with a notebook system. I hope Apple’s next crop of MacBook Pro machines increases the memory limit, or I’ll have to look at non-Mac alternatives.

“US killer spy drone controls switch to Linux”

So you’ve got an unmanned flying drone with deadly weapons, controlled by ground stations that could be hundreds or thousands of miles away. Of course you run it with the most popular and least secure operating system on the planet! I mean, what could possibly go wrong?

I’ve said it before, in all-caps and bold italics: DO NOT USE AN INSECURE FREAKIN’ CONSUMER OS LIKE WINDOWS ON VITAL CONTROL SYSTEMS!

At least a few of the higher-ups in the military seem to have learned to listen to what their technical people have likely been saying for years. With any luck it’ll filter out to the private sector too, and sooner rather than later.


EDIT, three hours later: in an ironic twist, it turns out that today is the tenth anniversary of Bill Gates’ “trustworthy computing” memo. While the change in focus has been welcome, it hasn’t really hardened Windows, just eliminated the most blatant insecurities. Windows remains basically a single-user consumer OS, and still tries to be consumer-friendly at the expense of security. So long as Microsoft refuses to require people to learn anything in order to use Windows, it will never be secure.

This may sound strange, but that isn’t necessarily a problem. A consumer OS should be easy to use, and shouldn’t require the user to learn any more than he could pick up by sitting at the keyboard and playing with it. At the same time, such an OS should never be used for anything vital — leave it to what it’s good for, which doesn’t include anything that requires security.