Monday, June 10, 2013

Wordpress Security Vs. Wordpress Search Ranking

Wordpress security is sometimes at loggerheads with a site's search ranking. There are many tricks and tips recommended by security wonks that will actually decrease a site's search ranking, such as banning all hits to xmlrpc.php or disallowing various paths in robots.txt. I've experimented over the last several days and learned what works and what is counter-productive. I do not believe it is wise to ban hits to xmlrpc.php, and I do not think web admins should second-guess Google when it comes to directing robots. Google knows what it is doing, for the most part, and additional rules make Google angry, in a manner of speaking. I watched my site plummet from #2 in the search rankings for a particular term to #5 after adding a lot of rules to robots.txt. Needless to say, I yanked those rules right out!
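
For the record, here is the sort of blanket ban I mean: a minimal .htaccess sketch of the advice I decided against, not something I use or recommend.

    # Deny every request for xmlrpc.php (Apache 2.2 syntax).
    # Security guides often suggest this, but it also locks out
    # legitimate clients: pingbacks and remote publishing tools
    # talk to Wordpress through this very file.
    <Files xmlrpc.php>
        Order Deny,Allow
        Deny from all
    </Files>
    # (On Apache 2.4, "Require all denied" replaces the two Order/Deny lines.)

Anything legitimate that speaks XML-RPC gets a door slammed in its face, which is exactly why I leave the file alone.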

There is such a thing as having enough, or even too much, security. With regular backups of the database and the files, I am not inclined to follow all of the recommendations set forth by Perishable Press, one of the few sites I regularly follow. I view Perishable's advice as guidelines and educational material. The author has a knack for explaining technical issues without resorting to jargon, in a humorous style reminiscent of Stephen King--the American vernacular, gotta love it--and he offers excellent examples for .htaccess. His is the site I go to when I am confused about arcane .htaccess syntax, which is often, because .htaccess syntax is unintuitive. I use some of his security tips, but not all, because some cause problems. I am also concerned that the tips may create other problems I cannot detect, problems that may only become evident later, after I add a new plug-in or Wordpress ships an update.

Perishable's .htaccess code is sometimes compressed in a way that makes it difficult to debug or to understand what is being done. Perhaps that is a form of showing off, or maybe the intention is for the code to execute faster, but I'd rather sacrifice efficiency for readability and ease of maintenance.
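
To illustrate, here is a contrived example of my own (not Perishable's actual code), comparing a compressed rule with the same protection written out for a human reader:

    # Compressed style: one dense line, no comments.
    RedirectMatch 403 \.(bak|ini|log|sql)$

    # Readable style: the same protection, spelled out for future maintenance.
    # Deny direct web requests for backup, config, log, and SQL dump files.
    # (Apache 2.2 syntax; on Apache 2.4, use "Require all denied" instead.)
    <FilesMatch "\.(bak|ini|log|sql)$">
        Order Deny,Allow
        Deny from all
    </FilesMatch>

A few extra lines, but six months from now I will know at a glance what the rule does and why it is there.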

I am no stranger to compressing code. I won a little contest back in the '80s, getting my name and program published in a national magazine. The challenge was to write a BASIC program that did something cool in only one line; statements could be separated by colons (:), and GOTO 0 was allowed. Was it a useful skill? Maybe. That sort of experience may have helped me become a better maintainer of other people's spaghetti-code programs, which comprised a large portion of my career. I rarely had difficulty finding and fixing bugs.

I suspect the Apache developers wrote the language for .htaccess back when every byte mattered, and in order to save a couple of bytes, they made the language cryptic and anti-human. I much prefer languages such as COBOL, batch scripts, or BASIC for their sheer readability. I never was a fan of C++, even if it is twice as fast; if you need speed, buy a faster computer. When programming languages are easier to understand and to write, greater deeds may be wrought by human minds, and with far fewer bugs. That's my philosophy about programming. I have indeed worked with extremely cryptic programming languages--assembler, no less--so I am merely stating my own preference as a programmer and user. It's nice to be able to look at source code and figure out what is going on within a few moments. Maybe my opinion does not dovetail with job security for programmers already entrenched in cryptic languages, but it seems rather obvious to me.

