I have purchased every consumer version of Windows since Windows 3.1, except Windows Me. I have also purchased dozens of Office licenses, Server licenses, and CALs. I will not buy another copy of Windows until the activation system is removed. Not another moment of my time will be wasted reading excessively long activation keys into my telephone, only to have the key automatically rejected, then manually accepted by someone on the phone after a few more minutes of inconvenience. I have had enough.
I know how to pirate Windows. It is easy to find, and easy to do. A simple search on The Pirate Bay yields a torrent with 1,265 seeders for an activated Windows 7 Ultimate ISO. It takes less time than buying it, I don't have to deal with the broken Live login system, and the BitTorrent downloads are faster. I only know this because I had to download an ISO after one of my discs became unreadable and Microsoft refused to replace it. Why can't I just enter my product key and download an updated ISO with slipstreamed service packs? Why do pirates have a better experience than paying customers? If I want to reinstall my system, upgrade my hardware, or switch between Boot Camp and virtual machines, I will. I am not calling to beg for permission any longer.
I put up with this broken activation system on Windows XP, where being forced to reinstall every couple of years was all but unavoidable. I have dealt with this broken system too many times, and I refuse to do so any longer. If I have to go through the hassle of pirating the next version of Windows just to get a working copy, I will not be paying for it.
Why does Microsoft insist on making the product harder for paying customers to use than for pirates? Why alienate the people who actually pay for the software? Until this consumer-hostile tomfoolery comes to an end, I am pirating Windows. Take the advice of Gabe Newell and address the service problem that is causing piracy. Until that is done, I am not buying any more software.
Please do everyone a favor and stop this. Piracy is not being prevented. No amount of legislation will remove piracy from the internet. Even if The Pirate Bay is shut down, there will be other ways: pirates will simply borrow the corporate edition from work, or download at link speed from Usenet. The only thing these activation schemes accomplish is inconveniencing those of us who pay for the product.
A little more than one year ago, I purchased a 256GB Crucial SSD, the CT256M225, from Newegg for $639.
Setting the drive up was a snap. I installed Windows 7, which automatically detected the SSD and enabled TRIM support. Not knowing much about SSDs at the time, I did some research into the optimal settings, and at every turn found that Windows had already detected and applied the correct one. There was literally zero configuration required on my end.
Obviously, $639, at nearly $2.50 per gigabyte, is a rather steep price to pay, so my chief concern was reliability. After a year of 24/7 uptime under heavy use, logging 8,768 hours powered on and just 6 reboots, I can safely say that my concerns about modern SSD reliability were unfounded. The excellent CrystalDiskInfo reports that the drive is still 67% healthy.
Everything I've learned about SSDs suggests that their failure modes are not catastrophic like those of spinning-disk drives; instead, worn blocks transition to read-only. Given how long I've been running the drive, I suspect it will last another two years without serious issue. Economically, that works out to a bit over $200 per year.
From a performance standpoint, transitioning to an SSD has yielded the single greatest improvement I've experienced from any hardware upgrade, ever. The upgrade was worth every cent, and I was happy with it immediately, though I was somewhat concerned the drive would only last a year. Now that I know it will almost certainly last three years, I consider it a bargain.
If you don't have an SSD yet, you should seriously consider one. I was looking to get a second CT256M225 for my development virtual machine, but found that it has since been discontinued. Fortunately, it has been replaced by the C300 series, which sports 355MB/s read and 215MB/s write performance, compared to the 250MB/s and 200MB/s of the CT256M225, at a reduced price of about $475. That's $165 less than I paid a year ago, or about $1.85 per gigabyte. Crucial also offers 128GB and 64GB drives for about $240 and $140 respectively.
If this trend continues, in another year the price per gigabyte could be as low as $1.20, yielding a $300 256GB drive, or a 512GB drive for right around $600.
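For the curious, the per-gigabyte and per-year figures above work out as follows. This is a quick sketch using awk; the prices are the ones quoted in this post, not current market data:

```shell
# cost figures computed from the prices quoted above
awk 'BEGIN {
    printf "CT256M225 at launch:   $%.2f/GB\n", 639 / 256    # rounds to $2.50/GB
    printf "C300 replacement:      $%.2f/GB\n", 475 / 256    # about $1.85/GB
    printf "Projected at $1.20/GB: $%.0f for 256GB\n", 1.20 * 256
    printf "Drive cost per year:   $%.0f over 3 years\n", 639 / 3
}'
```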
I've had a great experience running my entire system, operating system and software alike, off the 256GB drive, with a 2TB spinning disk for large files and virtual machine images. If you're budget-constrained, I think it would be well worth the $140 to pick up a 64GB SSD for your operating system and critical applications; even that usage would provide substantial, noticeable performance improvements.
I began experiencing an issue where Windows Home Server was not initiating automatic backups of my Windows 7 Ultimate 64-bit desktop. I believe the issue was caused by a UPS that was incorrectly reporting the system as on battery power 100% of the time, even when it was actually on utility power. To work around the problem, I set up a task in the Task Scheduler to initiate an automatic backup manually. The command to initiate a backup is:
"C:\Program Files\Windows Home Server\BackupEngine.exe" -a
You’ll need to launch the command from an elevated command prompt for it to work. To schedule it, open the Task Scheduler, select “Create Basic Task”, and follow the prompts until you reach Program/Script. There, enter “C:\Program Files\Windows Home Server\BackupEngine.exe” as the program and -a as the arguments. Once you complete the remainder of the wizard, the properties for the task will open. From here, make sure “Run with highest privileges” is checked on the General tab, and that “Start the task only if the computer is on AC power” is unchecked on the Conditions tab. You may also want to check “Run task as soon as possible after a scheduled start is missed” on the Settings tab, which ensures the task runs as soon as it can after a missed start time.
This process creates a scheduled task that initiates WHS automatic backups every day.
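If you'd rather skip the wizard, the same task can be created from an elevated command prompt with schtasks. This is a sketch assuming a daily 2:00 AM run; the task name is arbitrary, and the AC-power and missed-start settings still need to be adjusted in the task's properties afterward, since the basic schtasks switches don't cover them:

```
schtasks /Create /TN "WHS Manual Backup" /SC DAILY /ST 02:00 /RL HIGHEST /TR "\"C:\Program Files\Windows Home Server\BackupEngine.exe\" -a"
```

The /RL HIGHEST switch corresponds to the “Run with highest privileges” checkbox in the GUI.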
I ran into a strange problem recently: I couldn’t ping to or from a Windows XP box on my network. After some unsuccessful experimentation, I suspected the NVIDIA chipset, so I disabled TCP/IP Acceleration in the NVIDIA control panel, which fixed the problem immediately.
It appears there is some kind of bug in NVIDIA’s TCP/IP hardware acceleration that severely affects ICMP packets, so I would recommend disabling it. This isn’t the first time I’ve heard of the NVIDIA network configuration settings causing strange issues.
Over the past day or so, I’ve been working on the absolute worst kind of tech problem: an intermittent one. Some time ago I deployed another web server, and recently my monitoring service started notifying me that, about once an hour, it would become inaccessible from the internet for a few minutes. The strange part was that I couldn’t detect any problem accessing the site locally, and most of the time it was accessible from outside the network as well.
I initially suspected that the culprit was my Mongrel installation, so I got to work troubleshooting it. I spent over an hour on Mongrel before realizing it probably wasn’t the problem, and indeed it wasn’t. My first mistake was not attacking the problem from the bottom of the stack. This server is deployed behind a Linux bridging firewall and is physically a Sun Microsystems box running Xen; this particular server is virtualized on that box along with several other systems, and runs an Apache load balancer in front of the Mongrel cluster. In other words, there are many layers where something could go wrong. The correct approach would have been to verify network connectivity, internally and externally, during an outage period with a tool like ping. Once I confirmed that Mongrel was running just fine, that’s what I did.
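That connectivity check can be as simple as a timestamped reachability log left running through an outage window. A sketch, where the target IP, interval, and duration are all placeholders to adjust:

```shell
# probe a host every 5 seconds for about an hour, logging up/DOWN with a
# timestamp, so intermittent outages show up clearly in the log
target=203.0.113.10                # placeholder: the host being monitored
for i in $(seq 1 720); do          # 720 probes at 5s intervals = ~1 hour
    if ping -c 1 -W 2 "$target" > /dev/null 2>&1; then
        state=up
    else
        state=DOWN
    fi
    printf '%s %s\n' "$(date '+%F %T')" "$state" >> reachability.log
    sleep 5
done
```

Running one copy inside the network and one outside, then comparing the logs, tells you immediately whether the failure is in the host or somewhere along the path to it.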
After I allowed ICMP packets to the host through the firewalls, I was able to determine that, intermittently, nothing was reaching that host at all. I inspected the router’s routing table and found I could get packets flowing again whenever a disruption started. Obviously, I couldn’t manually reset the routing every time the connection was lost, so I set out to find the real problem.
After examining my Xen configuration files, I started looking at the MAC addresses the bridge was seeing. I quickly discovered that my server’s MAC address wasn’t listed, because after periods of inactivity it would effectively time out. On closer inspection of the server’s network configuration files, I finally determined that while the server had static public IP addresses and a gateway set, it was also pulling a private IP address and a gateway from the DHCP server. That DHCP interface was acting as the primary interface, which was causing the communication errors. Once I disabled DHCP and configured a static private IP address, the issues stopped.
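For reference, the fix amounts to replacing the DHCP stanza for the private interface with a static one. Here’s a sketch in Debian-style /etc/network/interfaces syntax; the interface name and addresses are placeholders, not the actual values from my server:

```
# /etc/network/interfaces (excerpt)
# Was: "iface eth1 inet dhcp" -- which pulled a lease AND a competing
# default gateway, making this the primary interface
auto eth1
iface eth1 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    # deliberately no 'gateway' line here: the default route stays
    # on the public-facing interface
```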
When dealing with complex environments and intermittent problems like this, it is important to isolate the problem and address it in a bottom-up manner. Had I started at the base of the stack and checked network connectivity first, instead of assuming Mongrel was at fault, I would have saved quite a bit of time. Additionally, with intermittent problems it is important to track down the source and either fix it outright or force it to fail consistently, so the issue can actually be addressed. Once I realized the issue was a routing problem somewhere along the chain, and could make it fail predictably, isolating the cause was much easier, because I no longer had to second-guess whether my actions were making the problem worse.
If you follow these two rules when dealing with complex, intermittent issues, you’ll save yourself a great deal of time and trouble and ultimately arrive at an exact solution. In other words, handle bugs in complex environments as methodically as possible; otherwise, your actions may exacerbate the situation.
In the past couple of weeks I’ve run into two different Smitfraud malware infections. Smitfraud is a pain to remove and has a tendency to come back. In one instance, it kept manifesting as a red desktop background featuring a biohazard symbol and the text “Privacy in Danger”. Traditional spyware removal tools seemed largely ineffective against this particular infection, but I’ve located the following tools that seem to be of some assistance:
When I ran these four tools in succession from Safe Mode on the affected systems, they cleared the malware up right away without necessitating a reinstall. I hope Smitfraud isn’t the future of malware, because it isn’t easy to get rid of.