The Deliverator – Wannabee

So open minded, my thoughts fell out…

Archive for the 'Tech Stuff' Category

Upgrades to Silverfir’s Xenserver hosting environment

Posted by Deliverator on 11th February 2015

It has been almost a year since my last post about Xenserver. Silverfir's Xenserver environment has remained relatively static over that time. I ended up purchasing the Asrock Avoton C2750 motherboard mentioned in the previous post and using it as the basis for a Xenserver pool master / redundant server. I also topped up the Core 2 Duo machine to its maximum of 8 GB of RAM. The older Core 2 Duo machine actually turned out to be the better choice of primary host for the Silverfir VM, as the Avoton C2750 based machine had a tendency to lock up hard every 2 weeks or so. That is plenty reliable for desktop purposes, but not for a computer intended as a low-maintenance server. One nice thing about the Asrock motherboard is that it features an IPMI interface, which allows one to perform basic management tasks like bouncing a box remotely, as well as IP KVM type stuff. The IPMI is really quite impressive, but it is small consolation if the underlying platform it is designed to manage isn't stable.
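The remote reboot capability mentioned above can be driven from any machine with ipmitool installed; a minimal sketch, where the BMC address, user and password are hypothetical placeholders rather than this board's actual settings:

```shell
# Query, then power-cycle, a hung host over its IPMI LAN interface.
# Host, user and password below are placeholders for illustration only.
ipmitool -I lanplus -H 192.168.1.50 -U admin -P secret chassis power status
ipmitool -I lanplus -H 192.168.1.50 -U admin -P secret chassis power cycle
```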

Recently, Citrix released a new version of Xenserver, version 6.5. I decided to upgrade all the cluster's boxes to the new version. Installation was as simple as burning a CD, popping it in and doing an in-place upgrade over the existing installation. While I was at it, I checked Asrock's website and found they had released a fresh BIOS for the motherboard as well as new firmware for the IPMI. The release notes mentioned numerous stability improvements, so I figured what the hell and went for it. So far, the Asrock motherboard has been 100% rock solid since the upgrades. If this stability continues, I will switch to using the Avoton machine as the primary host, keep the Core 2 powered off and use it just as a cold backup. The Avoton should be much more power efficient.

Before doing the upgrades, I took the precautionary measure of making a full backup of the Silverfir VM using the excellent and inexpensive Xackup software. I've been backing up Silverfir semi-regularly over the last year, usually getting around to it every couple of months. I'd like to back up Silverfir more often, but it was taking upwards of 22 hours to complete a backup at ~2 MB/s. Also, because I am a cheapskate and am using the free version of Xackup, each backup has to be triggered manually. Investigating potential causes, I found that the Atom 330 box I use to run Xenserver's Xencenter management software and Xackup was just too low end to handle the task. Running Xackup was maxing the Atom 330's CPU cores at 100% all throughout the backup, even at the lower compression level settings. Not applying compression resulted in much larger backup files while not significantly reducing completion time. I decided that the management computer needed an upgrade as well, but wanted to keep it low power and relatively inexpensive.

I ended up settling on an Asrock Q2900-ITX Mini ITX board with an integrated, passively cooled Intel J2900 CPU. This board comes with a CPU that is significantly faster than the Atom 330 and sips power, which is nice in a management server that has to be online all the time. I gutted the old Atom 330 system, plopped in the new motherboard and 8 GB of DDR3 RAM, and while I was at it put in a 3 TB HDD to have more space for backups. After a fresh install of Win 7, Xencenter 6.5 and Xackup, I fired up the backup job, which previously took 22 hours. It now takes around 5 hours at maximum compression and the CPU cores stay below 70% the whole time. Mission accomplished.
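As a back-of-the-envelope check on the old figures, 22 hours at roughly 2 MB/s works out to an uncompressed image in the 150 GB range, which is consistent with CPU-bound compression, rather than disk or network, dominating the job:

```python
# Rough size of the Silverfir backup implied by the observed numbers:
# ~22 hours of sustained transfer at approximately 2 MB/s.
hours = 22
rate_mb_per_s = 2                     # observed throughput, approximate
total_mb = hours * 3600 * rate_mb_per_s
total_gb = total_mb / 1024
print(round(total_gb))                # ≈ 155 (GB, approximate)
```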

Posted in Emulation and Virtualization, Tech Stuff | No Comments »

Dharma / Maya – A New Virtualization Platform for Silverfir to Replace Minimus

Posted by Deliverator on 22nd March 2014

Back in 2009, most of Silverfir.net's services were migrated from an aging behemoth of a Compaq server named Frankenputin onto what was hoped would be a much more manageable platform, which I christened Minimus. Minimus was designed to be a server that could run contentedly in a closet for years on end. It was based around a dual-core Atom motherboard which sipped power, featured completely passive cooling and used a solid state drive as its boot drive for greater reliability. The host OS was Windows 7 running VMware Server 1.x to host an Ubuntu Linux virtual machine. Eventually VMware stopped supporting the free VMware Server 1.x line and we were forced to upgrade to VMware Server 2.x, which featured a barely functional web based management interface.

Several months ago, Silverfir started experiencing unexplained lockups every few days that required the virtual machine to be rebooted. This became annoying (especially for a box-in-the-closet) and very inconvenient for both Ryan and me. Eventually, the VM failed to boot at all and the chunk of Silverfir hosted by Minimus was down entirely. This coincided with an extended trip of mine to Rarotonga, an island in the middle of the Pacific Ocean with very minimal internet access. While Ryan and I were jointly investigating the causes of this misbehavior remotely, we found several VM metadata files that were zero bytes, and the backup copies of these files were zero bytes as well. I reinstalled VMware Server and Ryan recreated the VM definition files from scratch. During this process, the web interface was very difficult to work with, as it wasn't working properly in modern versions of IE and Firefox. Eventually, Ryan was able to get everything working again, and took the opportunity to upgrade from Ubuntu 9.04 to the current Long Term Support edition of Ubuntu, 12.04. Unfortunately, this did not solve the problem with the VM locking up every couple of days. I decided it was time to modernize Minimus.

After spending a week experimenting with a number of modern hardware assisted hypervisors, I eventually decided to use Xenserver 6.2. Xenserver is a free (in multiple senses of the word) minimal footprint hypervisor similar to VMware ESXi. Unlike VMware Server, which required a full-fledged host OS be installed, Xenserver's host footprint is very minimal, leaving more of the system's resources (especially RAM) free to be allocated to guest virtual machines. Because Xenserver relies on hardware level support for virtualization (a CPU feature called VT-x), guest virtual machines run much closer to the "bare metal" and feel a lot snappier as a result. Xenserver also supports another, newer hardware virtualization feature called VT-d, which allows hardware devices to be passed through directly to guest VMs. This allows devices like GPUs to be directly accessible to virtual machines. Citrix likes to show off this feature by running demanding, modern games like Skyrim in a VM and playing the game through a thin client device like an Ipad. Neat, but not very relevant to our particular use case.
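The VT-x requirement is easy to verify on a prospective host: Linux exposes virtualization support as the vmx flag (or svm, AMD's equivalent) in /proc/cpuinfo. A minimal sketch, written as a function over the file's text so it can be tested without real hardware:

```python
# Check a /proc/cpuinfo dump for hardware virtualization support:
# "vmx" is Intel VT-x, "svm" is the AMD equivalent.
def has_hw_virt(cpuinfo_text: str) -> bool:
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            return "vmx" in flags or "svm" in flags
    return False

sample = "processor : 0\nflags : fpu vme de pse tsc msr vmx est"
print(has_hw_virt(sample))  # True
```

On a live host this would be fed the contents of /proc/cpuinfo directly.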

The main things that I liked about Xenserver were:

  1. Free in multiple senses of the word. Based around an open source project, Xen, with widespread adoption both within the OS community and among major commercial users such as Amazon. Xenserver stores its VM metadata in standardized formats that are directly importable into other virtualization environments. Because of this, I have some confidence that I am not going to be bitten by a product discontinuation or lack of an easy forward migration path as occurred with VMware Server.
  2. After a week of hammering at it, Xenserver feels very mature. I only experienced one bug, relating to migrating VMs with associated snapshots, in a week of testing oddball cases. The Windows-based management tool, called Citrix Xencenter, is a pleasure to use.
  3. Xenserver allows me to create redundancy "pools," clusters of Xenserver hosts and shared storage resources that allow guest virtual machines to be moved back and forth between multiple physical servers without being taken offline. It is VERY cool to have a server running off one physical box one moment and, 30 seconds later, running off a different server with less than a single second's network downtime. This should allow Silverfir to stay online while hardware maintenance is being performed, something that wasn't possible under the previous VMware Server environment.
  4. Xenserver allows for easy snapshot backups of running VMs, allowing backups to be created while the server is in use. With VMware Server, the guest had to be shut down to back up the virtual disks, a process which took hours even using an eSATA based external backup drive.
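For reference, items 3 and 4 in the list above map onto Xenserver's xe command line as well as Xencenter; a sketch, where the vm= and host= name-labels are hypothetical placeholders for this pool:

```shell
# Live-migrate a running guest to another host in the pool (item 3);
# vm and host names here are placeholders.
xe vm-migrate vm=maya host=dharma live=true

# Snapshot the same guest while it stays online (item 4).
xe vm-snapshot vm=maya new-name-label=maya-nightly
```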

Migrating Minimus to Xenserver was a fairly straightforward process. I was able to import the primary Minimus VMware virtual disk .vmdk file directly into a newly created VM guest in Xenserver. I had to edit grub, fstab and network interfaces to get the VM working in Xenserver, and also had to add and remove a few kernel modules and uninstall the old VMware tools, but all told I probably spent less than 2 hours getting Minimus running happily under Xenserver. What wasn't so painless was getting the second data volume .vmdk to import. This second .vmdk stored all of Silverfir's websites, photo galleries, etc…you know…the things that people actually care about. I received errors trying to import this file using a wide variety of virtual disk management/manipulation tools. I think this .vmdk file had been created in an earlier version of VMware and probably used an older version of the .vmdk format. Eventually, after almost a whole day of trying to import this file, I threw up my hands in disgust. As a workaround, I put both the old Minimus virtual machine and the new Xenserver virtual machine, which I am calling Maya, on the same network segment and created a new virtual disk container in Xenserver. Ryan then copied the data from the old virtual server to the new one using the magic of rsync. This took quite a while, as almost 500 GB of data needed to be copied over at 100 mbit speeds. After almost a full day of copying and some adjusting of permissions, Maya was substantially complete and took over hosting duties from Minimus. Maya has been happily hosting Silverfir without incident for over a week.
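The copy time lines up with simple arithmetic: 500 GB over an ideal 100 Mbit link is roughly half a day even before rsync overhead and lots of small files are factored in:

```python
# Best-case transfer time for the rsync copy described above:
# ~500 GB of data over a 100 Mbit/s link.
data_bytes = 500 * 10**9
link_bps = 100 * 10**6                    # 100 Mbit/s, ideal throughput
seconds = data_bytes * 8 / link_bps
print(round(seconds / 3600, 1))           # ≈ 11.1 hours, best case
```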

In the near future, I plan to decommission Minimus and replace it entirely. Maya's current primary Xenserver host is a Core 2 Duo with 7 GB of RAM. I plan on using most of the guts of Minimus to create a new server based around an Intel Avoton C2750 motherboard. This new system, which I am calling Dharma, will be the primary Xenserver host for Maya, with the current Core 2 Duo host serving as a high availability backup server. Hosting of Maya's data volumes will be via a Readynas Ultra 6 with a 12 TB RAID 6 array. Hopefully this new setup will allow for greater reliability and fault tolerance than was achieved with Minimus, and Maya can continue serving Silverfir's users for years to come.

Posted in Blogging, Emulation and Virtualization, General, Linux, Operating Systems, Tech Stuff | No Comments »

Social media integration for long-winded posts.

Posted by Deliverator on 18th November 2012

I’ve been forsaking the WordPress blog here for quite some time. Most of my ruminations seem to be too short for me to be bothered with writing a blog entry, so I’ve largely shifted to using my Twitter account. At the same time, I’m also finding Twitter’s 140 character limit a bit too limiting. I am often writing 3 or 4 back to back tweets on a subject, which I am sure does not endear me to followers uninterested in said subject.

There do exist 3rd party services like TwitLonger that work around Twitter’s forced brevity problem/feature, but I like to keep my data in-house to avoid many of the snafus that are part and parcel of using cloud services. Having lost or lost control of data important to me in the past, I don’t like trusting my content/making myself dependent on companies whose operational procedures are opaque to me and whose terms of service, business model, etc. might change with the blowing of the wind. It is one of the chief reasons I’ve yet to join Facebook, Google+, etc.

I am going to start testing various plugins for WordPress that allow me to automatically cross-post to Twitter as well as archive my tweets here in case Twitter’s business model becomes too onerous (the promoted tweets are already getting obnoxious).

Posted in Blogging, General, Mobile Blogging, Rants and Raves | No Comments »

The perpetually sucky state of non-destructive book scanning

Posted by Deliverator on 7th February 2012

Every few years I find myself in the unenviable position of unavoidably needing to non-destructively scan a book. Every few years I pray that someone has come up with an affordable, reasonably quick way of doing this that produces good quality results. Every few years I burn an evening researching the state of the field. Every few years I come away disappointed. Here are my observations from this go around:

Hardware:

Sheet Feeders – If you can afford to destroy the book, you can cut off the binding with a fine-toothed band-saw or other power tool of your choice and feed the pages through a sheet-feeder style scanner. Sheet feeders like the popular Fujitsu ScanSnap series can scan both sides of each page at something like 20 pages per minute at 600ish DPI. This is mighty impressive, as it cuts the actual scanning time for a book down to something like a half hour. Unfortunately, when I find myself in the position of needing to scan a book, it is usually some rare tome it took me 2 years and $300 to find on AbeBooks. For my purposes, solutions requiring a band-saw need not apply. Also, many of the better scanners cost $400+, which is pushing what I would consider affordable.

Commercial Copy Stands – These range from simple single overhead camera rigs to more complex dual camera rigs with adjustable cradles to support the book without damaging it, re-positionable lighting, non-reflective glass to hold the pages flat, automatic page flippers, etc. Commercial, off-the-shelf solutions from companies like Atiz can run $14k+ without even factoring in the cost of cameras (typically high end Canon DSLRs). Great if you are a university library spending grant money, sucky if you are a book nerd on a budget.

DIY Copy Stands – A substantial percentage of the functionality, speed and quality of these rigs can be replicated for under $1000 by building your own dual camera copy stand following one of the several increasingly standardized designs from the DIY Book Scanner project. This is still more time and money than I want to spend and probably more space than I want to waste for a device I would only very seldom use. When a full, well documented/supported, single evening kit is available for under $300 plus the cost of cameras, I will probably be interested. The BookLiberator looked to commercially produce kits that would meet all my requirements, but efforts to produce the units fell apart after Ion Audio announced its similar sub-$200 BookSaver product at CES 2011. Ion has since VERY quietly pulled the plug on the BookSaver without ever selling any, but their initial product announcement was enough to send most small, independent efforts to produce a similar device scurrying for someplace small and dark.

Flatbed Scanners – An inexpensive, mature, widely used technology which sucks at scanning books in a wide variety of ways. First, most flatbeds tend to be optimized for high quality scans of things like photos, not for speed. Second, most scanners have a significant bezel around the scanning platen, meaning the only way to scan a book is to significantly bend/distort the spine in order to get the pages to lie flat against the glass. Even mashed against the glass, you usually get significant page distortion near the binding, resulting in curving text and uneven illumination.

Several years ago I purchased a Plustek OpticBook 3600 plus, a flatbed scanner specially optimized for scanning books. The OpticBook has a very thin bezel along one edge of the platen which lets you open a book to a 90 degree angle and have one page flat against the glass while the other hangs freely over the side. This lets you produce an undistorted scan of a page without significantly bending the spine. The “DigiBook” software included with the scanner has an automatic page rotation feature, so that every other page gets rotated 180 degrees. This lets you scan a page, flip the book over to scan the opposite page and have everything automatically rotated the right way. There are giant over-sized buttons on the scanner that let you trigger a scan in B&W, greyscale or color. The actual scan takes 5-8 seconds, as the scanner is optimized for speed, rather than highest possible DPI.

The OpticBook concept is very nice in theory, but the implementation leaves something to be desired. Even with the scanner bezel as thin as it is, the scanning element doesn’t get close enough to the binding to scan most paperbacks. It works fine for hard covers like textbooks, where the content doesn’t start as close to the binding. The software is also very crash prone and the work-flow somewhat less than ideal, with the operator having to hit a “transfer” button in the software after each page to write the image out to disk, despite the over-sized buttons on the scanner itself. Anything that adds 5-10 extra seconds to the work-flow gets multiplied tremendously over a 500+ page book. These scanners are also very poorly sealed, with significant dust accumulating on the interior of the glass plate with no easy way to clean it short of disassembly. There doesn’t appear to be a way to adjust the lamp brightness, so you tend to get a bit of bleed through from text on the other side of pages you are scanning. Many users also complain of short bulb life, although my unit is still functional. From reviews I’ve read, I am not convinced that Plustek has learned much from their mistakes in successive models in this series.

Handheld Scanners – I’ve never been impressed with the quality of the results from hand-held “wand” scanners. I haven’t personally checked out any of these devices in years, as I’ve largely consigned the whole category into Sharper Image / SkyMall crap-gadget territory. If someone wants to tell me that X device can quickly and accurately scan a paperback, I may look into these in the future.

Software:

Post Processing – While hardware has seen little improvement since my last review, there have been some improvements on the software side of things. The DIY Book Scanner project has yielded a plethora of scripts, tools, etc. for packaging up scans into various digital book formats. Of these, I have found a tool called Scan Tailor to be the most polished and the easiest to install and use. Scan Tailor will take a directory filled with scanned images and will straighten, deskew, remove background and bleed-through (to give you black text on a pure white background), set a constant page size/margin, etc. Scan Tailor works almost completely auto-magically through each step of the process, and if it does make a mistake, it is easy for the user to intervene and apply a manual correction. Scan Tailor cut my workflow from previous years' 6-8 programs and scripts (each with fussy dependencies on libraries and frameworks) down to 3. I still do some post processing of scans in IrfanView, and Scan Tailor doesn't do the final bundling of images into PDF, DJVU, etc. or do OCR, but other than that it is pretty much a one stop shop for post scan image processing.

Binding – Once you have a directory full of post processed images, what are you going to do with them? I am still using Presto Pagemanager 7.10 for assembling my post processed TIFF images into PDFs. It isn't ideal in many ways, but has the virtue of not costing me anything more and working consistently, if in somewhat of a hurky-jerky, liable-to-temporarily-freeze-Explorer kind of way. I played around with a half dozen PDF/DJVU binder scripts/programs recommended by the book scanning forums and basically concluded that the free options all royally sucked in one way or another, not the least of which is requiring me to install 5 different programming frameworks just to try them out. Scan Tailor is a lovely, consistent, unified application that is easy to install and use. The DIY Book Scanner community could really use something as well done for the binding stage of the process. As it is, one is left to fend with a gobbledygook of unmanageable Python and Ruby scripts feeding various Unix command line utilities and throwing undocumented fits anytime they find something not to their liking. The situation is marginally improved if you want to output DjVu files rather than PDFs, but only marginally.
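One candidate for a simpler binding step, assuming the post-processed output is ordinary TIFFs, is the img2pdf utility (not one of the tools evaluated above), which losslessly embeds images into a PDF without re-encoding them:

```shell
# Wrap post-processed page images into a single PDF, losslessly.
# Assumes pages sort correctly by filename (page_001.tif, page_002.tif, ...).
img2pdf page_*.tif --output book.pdf
```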

OCR – So, you want to turn those post processed scans into a re-flowable format like .epub for easy reading on your ebook reader device? You are kidding me, right? OCR is one of those things that has been around since the dawn of scanning and, despite a lot of protestations, seems to have changed little. If you had asked me about the state of OCR 5 or 10 years ago, I would have told you there is Omnipage & Abbyy Fine Reader & everything else. Today, that still seems to be the state of the industry. I tried a half dozen of the everything else variety, including OpenOCR (Cuneiform), VietOCR and TiffDjvuOCR. Most of the free solutions seem to use Tesseract, an open source OCR engine maintained by Google. Across 3 books with straightforward, single column formatting and commonly used fonts, I found the free OCR packages basically good enough to create a rough keyword index for searching books, but nowhere near accurate enough to create a readable, reflowable ebook without significant time spent correcting errors. I concluded I might actually be able to retype a book faster and more accurately than I could correct all the strange and easily overlooked errors committed by OCR. I would be curious to try the commercial packages at some point, as a lot of book scanners seem to swear by recent versions of Abbyy Fine Reader, but I'm not really in the mood to spend $150+ to fart around with either of the commercial offerings.
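For the rough-keyword-index level of accuracy described above, the Tesseract engine itself can be driven straight from the command line; its stock "pdf" output config produces a searchable PDF (the original page image with an invisible OCR text layer underneath):

```shell
# OCR one post-processed page image into page_001.pdf, which contains
# the image plus a hidden, searchable text layer.
tesseract page_001.tif page_001 -l eng pdf
```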

Posted in Books, General, Rants and Raves, Tech Stuff | 2 Comments »

Ipad 2 – Easy Replacement

Posted by Deliverator on 4th June 2011

I’ve had a couple hardware problems with my Ipad 2 (which I purchased on the first day of availability). The first was that the battery life was never as good as my Ipad 1. With my Ipad 1, I could go almost a week without charging, so long as I was only using it for ebooks and light web browsing. With my Ipad 2, I found myself putting it on the charger every couple days. The other issue I had was my Ipad 2 had a leaky back-light. When reading ebooks on a black background, this was really apparent and annoying. When used for anything else, pretty much unnoticeable. However, I read a lot of books. Still, not quite enough of an issue for me to get up and do something about it.

Last week however, the battery situation worsened considerably. While reading a book (with no apps backgrounded), my Ipad 2 went from 80-some-odd percent to the 10% warning level in like an hour and a half. I put it on the charger and went to bed. In the morning, I found my Ipad 2 not only hadn't charged, but was dead as a rock. When put on the charger, I could get an Apple logo to show up and a "hook up to Itunes" restore message, but it would turn off and not power on under battery as soon as it was unplugged.

I ended up driving down to the Apple Store in Bellevue Square. The place was a complete zoo, but Apple seemed to have enough employees on hand to keep everything moving along. I proceeded to the back of the store, where the very busy "Genius Bar" was located, and was intercepted before I could quite get to the counter by a guy asking me if I had an appointment. I said I didn't and he pulled out his Ipad, asked me some basic questions about what was going on and created an appointment for me for 10 minutes later. In the meantime, I wandered the store looking at various accessories and doodads that I probably shouldn't buy. A tech flagged me down, questioned me about what I was experiencing, plugged my Ipad into a testing device, verified the problem and hooked me up with a new, non-refurbished unit within another 15-20 minutes. I got home, plugged the replacement in, restored my most recent backup and had my replacement unit fully functional in hardly any time at all. Granted, not everyone lives by an Apple Store, but I was very pleased by how quickly my problem was resolved. Oh, and the new unit did not have any back-light issues.

This Ipad 2 replacement process is a huge contrast to the long, uncertain chain of events that one must go through if a PC breaks down. Especially troubling is how manual (and thus difficult for a casual end user) the process is of migrating one’s programs, settings and data from one PC to another. The PC world has had decades to get this right and still gets it profoundly wrong. Quite simply, Apple with their Ipads and Iphones currently offers the easiest and most seamless old profile -> new device migration in the computing world.

Posted in iOS, Operating Systems, Rants and Raves, Tech Stuff | No Comments »

Hands on with Ipad 2

Posted by Deliverator on 17th March 2011

Well, I wound up getting myself an Ipad 2. I was able to sell my Ipad 1 via Gazelle for a substantial percentage of a new one, so the cost of upgrading was minimal. I ended up visiting 5 stores on the first day of sales (March 11th) before I finally found an Apple store at University Village with any left. I ended up getting a 16 GB model with Verizon 3g. As with the first Ipad, I purchased a 3g model not for the 3g connectivity but for the GPS functionality. I’ve been using the new Ipad 2 pretty heavily the last few days and thought I would share my thoughts:

-The Ipad 2 feels very different when held, due to the curved edges and flat back. The flat back is really nice, as the first generation Ipad had a curved back, wouldn't lay flat on a table and tended to want to scoot around when used. The curved edges are frankly quite annoying, as they make it extremely difficult to plug in the main dock connector, and the 1/8th" headphone port doesn't fully mechanically support the headphone plug on all sides, making it difficult to determine if headphones are properly seated. This may also prevent the use of certain headphone designs.

-The Ipad 2 includes only a single speaker like the Ipad 1. It would have been nice if they had gone with stereo speakers, but that said, the speaker on the Ipad 2 is a LOT better. The original’s speaker was overly quiet and wasn’t much good for watching movies and the like.

-Upgrading from the Ipad 1 to Ipad 2 was super-easy. I ran a full backup of my first generation Ipad and then synced my Ipad 2. Itunes transferred all my apps and data. I only needed to manually restore settings for a few programs, such as my twitter client, online banking client, etc. which don't allow their data to be backed up. This had to be the easiest old computer->new computer migration I've ever done.

-I purchased an official Apple leather “smart cover” to go with my unit. I really like Apple’s minimal approach to providing screen protection. The smart cover adds only minimal thickness, provides scratch protection when you have the Ipad in a bag and folds up into a stand giving you two useful viewing angles. It also engages/disengages sleep mode on the unit. I expect this cover to be a good solution for most users. I doubt it will be the final solution for me, however. The Ipad 2, like the Ipad 1, is very slick, literally. There is no texture to the back surface, making it difficult to grasp one handed. I bought a Street Skin for my first unit and will likely do the same when they introduce an Ipad 2 version.

-The cameras are really crappy, especially for still shots. Grainy, low resolution, motion blur and poor light sensitivity are all words I would use in connection with these cameras. I really wish Apple hadn't gone so low end in this area. The two upsides to the inclusion of cameras are that Apple has, for the first time, come out with a good video editing program for the Ipad, and that the camera works with most iOS apps designed for the Iphone. This has enabled me to deposit checks into my bank account without needing to visit my bank, for instance.

All in all, I really like the Ipad 2 over the Ipad 1, but it definitely has its faults.

Posted in iOS, Operating Systems, Tech Stuff | No Comments »

Inexpensive/Disposable Video Cameras

Posted by Deliverator on 16th March 2011

Five and a half years ago I started fooling around with "disposable" video cameras being sold through the CVS pharmacy chain. These video cameras were meant to be one time use equivalents of the cardboard box disposable still cameras still sold at many stores throughout the world. The idea was you would pay around $30 for the camera, go out and take some footage and then bring the camera back, and they would give you a DVD with your video on it but keep the camera. The pharmacy would then wipe your unit and sell it again to someone else. The CVS cameras were small, robustly built, powered by simple AAs and inexpensive. Naturally, the hacker community went to work on the cameras and quickly figured out how to download the video without the pharmacy's help, making them reusable. These were great cameras for use in places you wouldn't want to risk a "real" camera. People attached them to model rockets, helicopters, planes, placed them next to hot things, explody things, etc. They were cheap enough that you wouldn't think twice about risking the camera on the off chance of capturing some cool footage. Naturally, I bought half a dozen.

Over the years, I've attached them to robots, glued fisheye lenses on them, put them in zip-lock bags and used them underwater. I've captured some really fun footage because I was no longer nervous about risking the camera. In the process I've destroyed two cameras outright, permanently modified two for niche uses, and one is good only for spare parts. Only two escaped my abuse entirely unscathed. Today, I threw them all away.

Why?

Quite simply, the magic economic equation of gadgets + mass market demand + capitalism + time has rendered the old CVS cameras obsolete. For under $50 I can now buy a camera from Kodak that is quite a bit smaller, holds more video at higher quality than the CVS cameras, and is mildly hardened for rugged and underwater use. If you shop around, you can get this camera for more like $40 at stores like Best Buy, but I just got mine at Amazon. There are similar form factor cameras from other makers, but most are significantly more expensive HD capable units designed more for people wanting a cheap, small, everyday camcorder or for technophobic people looking for a very easy to operate video camera. These units (the Flip, for example) tend to be more like $100.

Tomorrow, I am going to strap one onto a robot and watch things go crunch. If the camera survives, great! If it doesn’t, the camera’s Micro-SDHC card is small enough that I can find it intact in the twisted, shattered remains and I probably got some great footage for $50. Photography and videography is at its most interesting when people are willing to push boundaries and experiment. The technology has finally gotten cheap enough that “that would be really cool but I don’t want to break this expensive piece of equipment” is no longer part of the equation.

Some thoughts on the Kodak Mini Video Camera:

-Captures 640×480 at 30 fps as an AVI file, using the MJPEG video codec and 16 bit PCM audio at 11 kHz. At this setting you can fit about an hour's video on the included 2 GB Micro-SDHC card. You can also do QVGA at 60 fps and take stills as well. There doesn't appear to be any image stabilization, but what can you (currently) expect from a camera that is under $50. Give it a few years though…

-The camera has a built in rechargeable battery. A full sized Type A USB connector pops out of the side for charging. You will need a USB extension cable (not included) to plug it into a PC to charge. My unit did not show up as a USB mass storage device when I plugged it into a computer running the 64 bit version of Windows 7. Other users report it coming up as a drive letter and forcibly installing (without prompting) a piece of software called Arcsoft Mediaimpression SE, which also seizes control of most video/photo file extensions. I was glad this was not the case with my unit.

-Because my unit doesn’t show up as a USB mass storage device, I had to pop the Micro-SDHC card out of the bottom of the unit. I had to use itty-bitty tweezers (thanks Tweezerman!) to grab onto the card as there is no ejection mechanism for the card. A 2 GB card was included with mine, but this camera is sometimes sold without a card.

-The camera is exceedingly easy to use, with just an on/off button, a 4 way arrow pad with center selector, and a “settings menu” button. The simple control scheme should make this a good camera for micro-controller driven operation, if someone wants to strip it down to just the circuit board for use on a rocket, kite, balloon or something.

-The whole unit is smaller than a pack of cards.

-I am not sure I would entirely trust the camera’s built in waterproofing. The only points of entry for water are through the base, which hinges open to reveal the USB connector and card slot, and potentially around the rubber membrane buttons. The base does have some rubbery gasket material to seal against water, but it is pretty minimal. I would recommend coating the area with a thick grease (Vaseline, etc.) before submersion in water beyond a few feet.
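As a sanity check on the recording specs in the first bullet, a quick back-of-envelope script shows what average bitrate the “about an hour on 2 GB” claim implies. The mono-audio assumption and the decimal-gigabyte card capacity are mine, not Kodak’s:

```python
# Back-of-envelope check of the "about an hour on a 2 GB card" claim.
# Assumptions (mine, not Kodak's): mono audio, 2 GB = 2 * 10**9 bytes.

CARD_BYTES = 2 * 10**9      # nominal capacity of the included Micro-SDHC card
RECORD_SECONDS = 60 * 60    # roughly one hour of footage

# Uncompressed 16-bit PCM audio at 11.025 kHz, one channel
audio_bps = 11025 * 16 * 1

# Average total bitrate implied by filling the card in an hour
total_bps = CARD_BYTES * 8 / RECORD_SECONDS
video_bps = total_bps - audio_bps

print(f"implied average bitrate: {total_bps / 1e6:.2f} Mbit/s")
print(f"of which MJPEG video:    {video_bps / 1e6:.2f} Mbit/s")
```

The implied ~4.4 Mbit/s is on the low side for MJPEG at VGA/30 fps, which squares with both the capacity claim and the camera’s budget image quality.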


Posted in Photography, Portable Computing/Gadgets | 1 Comment »

My take on Ipad 2

Posted by Deliverator on 2nd March 2011

I own an original model Ipad and use it daily. Today, Apple announced Ipad 2. Here are my thoughts on it and whether it is enough for me to upgrade:

Pluses:

-Ipad 2 is thinner and a little lighter than the original, while retaining the same general width x height and screen size of the original. The slight reduction in weight will be nice for those who use it as a book reader, as arm fatigue was a definite factor with the original.

-Ipad 2 has dual cameras. I’ve never seen video conferencing as much of a killer app, but I know some people who were really dying for this with the Ipad 1.

-Ipad 2 has a new, faster dual core processor with what is being described as “9x” faster graphics. I am all for increased performance, but it is up in the air whether many app makers will write applications that use the faster subsystems, at the risk of alienating the large Ipad 1 user base.

-Magnetic screen cover system is a big plus in my view, as it lets you protect the Ipad’s screen when putting it in a bag or (in my case) large pocket, while adding little to the dimensions of the unit. Most cases for the Ipad 1 greatly increased the unit’s apparent bulk.

-3G models available for Verizon and not just AT&T.

-Has some extra motion sensing capability (3 axis gyro) compared to Ipad 1, which should be nice for gaming.

-One of the big pluses in my view is the new HDMI video output adapter, which works for ALL applications. This is a big change from Ipad 1 where applications had to be specially coded AND approved for TV output use. Think Hulu+, games, etc.

-Pricing is being kept competitive with, or slightly lower than, similar Android devices.

Negatives:

-No built in SD slot for downloading photos. This should have been doable even with the thinner bezel of Ipad 2. The lack of an SD slot was a consistent minus cited by many Ipad 1 users/reviewers. I hate that Apple tries to make their devices aesthetically clutter free at the expense of needing to buy and carry a lot of easily lost adapters & dongles.

-No USB port. Nuff said. Wasn’t a big issue with me, but I know a lot of people wanted it.

-The Ipad 2 apparently still has only 256 MB of ram. I’ve bumped up against this consistently in my everyday use of the Ipad 1, which has the same amount, especially when doing tabbed browsing.

-Still requires an external power brick for charging, rather than being able to charge via USB from most computers, even if that would take significantly longer.

-Still no syncing over WiFi.

Externalities:

-Apple is beginning to enforce much harsher terms on 3rd parties wishing to supply content to Ipad users. They are essentially requiring any content being provided to users to also be available for purchase through their own content stores at the same price, so that they can get a (sizeable) cut of the pie. This will apparently apply even when the purchase is made “off site” and not as an in-app purchase. This will effectively make it impossible / not cost effective for competitors like Kindle, Nook and Sony to offer eBooks to Ipad users, and will likely broadly apply to other types of content as well. I find this move to be incredibly anti-competitive, and it is a HUGE minus for me. One of the things which has made Ipad such a compelling part of my daily life is its ability to consume media from a variety of sources, whether that is news, books, music, podcasts or video. By constraining my choices to what Apple itself offers, they have greatly limited the appeal of the whole platform to me. If it weren’t for this single thing, I would probably buy an Ipad 2. As is, if these changes take effect, I may sell my existing Ipad 1 in favor of an Android alternative.

Posted in General, iOS, Rants and Raves, Tech Stuff | No Comments »

Some thoughts on Western Digital My Book Essential 3TB USB 3.0 External Hard Drive

Posted by Deliverator on 25th February 2011

I recently found myself spending so much time juggling how I was storing my data in order to get it to fit on a combination of a 2 TB and a 1.5 TB external hard drive, that I thought it might be worthwhile to revisit getting a 3 TB external drive for backup. I decided against getting a 3 TB drive when they first came out, in part due to AnandTech’s unfavorable review of the only 3 TB drive on the market at the time, Seagate’s GoFlex Desk 3TB. The big turn-offs for me were the poorly designed enclosure (resulting in very high temperatures), the high $/GB ratio and a host of compatibility issues. Since that time, both Western Digital and Hitachi have gotten in on the game as well with 3 TB offerings of their own. I opted for the My Book Essential 3TB, since it seemed to have the best designed enclosure of the bunch, offered the cheapest $/GB ratio of any of the 3 TB drives on the market at $165 via Newegg, and gave me a chance to try out the USB 3.0 port on my Asus P6X58D motherboard.

Installation of this drive was decidedly NOT a breeze. I ended up having to update my motherboard’s bios, USB 3.0 controller sub-firmware and USB 3.0 drivers just to get the drive recognized, and then had to install and update Western Digital’s included Smartware software to update the drive’s firmware before it worked properly. I wouldn’t recommend this to clients as a “just plug it in to gain 3 TB of storage” device, but once I got it working it has behaved like any other external hard disk drive, has stayed comfortably cool via strictly passive ventilation, and has worked reliably through a multi-terabyte initial data copy and subsequent daily backups.

Western Digital doesn’t exactly go out of their way to advertise it, but this drive spins at something below 6000 RPM (hence the assorted eco-branding). Even with the fast USB 3.0 interface, this drive performs considerably worse than any 1.5 or 2 TB drive I’ve owned, even with those drives in USB 2.0 enclosures. This drive is decidedly for bulk data storage purposes only.
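To put the spindle speed in perspective, here is a quick sketch of average rotational latency (the time for half a revolution, since on average the target sector is half a turn away). The 5400 RPM figure is my guess; WD only commits to “below 6000 RPM”:

```python
# Average rotational latency is the time for half a revolution.

def avg_rotational_latency_ms(rpm):
    seconds_per_revolution = 60.0 / rpm
    return seconds_per_revolution / 2 * 1000  # convert to milliseconds

# 5400 RPM is an assumption; 7200 RPM is the usual desktop-drive baseline
for rpm in (5400, 7200):
    print(f"{rpm} RPM: {avg_rotational_latency_ms(rpm):.2f} ms per access")
```

That extra ~1.4 ms on every random access adds up quickly under real workloads, which matches the sluggishness I’ve seen.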

One other thing Western Digital doesn’t advertise is that the drive used in the enclosure is the same WD30EZRS series drive which they sell for ~$35 MORE as a bare OEM drive sans enclosure. Popping the drive out of its enclosure is relatively straightforward, although you are likely to pop a couple plastic clips in the process, voiding your warranty. Still, if you are looking for a 3TB internal drive on the cheap and don’t mind potentially voiding your warranty coverage, this is about as cheap as you can get one.

I ended up picking up a second unit to use as an internal drive. I kept it in its enclosure long enough to update it to the most recent drive firmware and then popped it open. I am keeping the enclosure in case I ever need to apply another firmware update. It has functioned like any other non-boot drive in my system, except that the performance characteristics are such that if you have more than a few apps contending for I/O attention from the drive, throughput drops enough that HD video streams start breaking up. This can be problematic if you are trying to watch a movie and a backup job starts in the background, for instance. To reiterate, this drive, whether used externally or internally, should be used for bulk data storage only.

Posted in General, Rants and Raves, Reviews, Storage, Tech Stuff | No Comments »

My take on Light Peak/Thunderbolt

Posted by Deliverator on 25th February 2011

With this week’s refresh of Apple’s Macbook Pro line of computers, consumers are going to get their first sampling of Intel’s Light Peak technology under the moniker “Thunderbolt.” Apple is no stranger to introducing new external interfaces, having premiered and acted as the die-hard champion of Firewire and DisplayPort. Both of these technologies, though offering technical advantages over other interfaces at their time of introduction, never really became mainstream and have remained pricier than the alternatives. With USB 3.0 having beaten Thunderbolt to market by almost a year, I know a lot of techies have taken a brief look at Thunderbolt and dismissed it as yet another connector to try to fit on a motherboard bezel. I’ve looked at Thunderbolt in some depth, and the deeper I’ve dug, the more interested I’ve become. If widely adopted, I think it may reshape the collection of peripherals and mess of wires that have come to represent a “Desktop” level computing environment.

The salient points:

-Thunderbolt offers significantly more bandwidth than USB 3.0: dual fully bi-directional 10 Gbps channels, for up to 20 Gbps in each direction. USB 3.0, after overhead, offers around 3.2 Gbps. This greatly influences the classes of peripherals that could be run over a link. Think externalizing GPUs vs. external hard drives.

-Thunderbolt provides significantly more power to external devices than USB 3.0. USB 3.0 gives you a little under 5 watts to play with, which, while an improvement over USB 2.0’s ~2.5 watts, is less than half of Thunderbolt’s 10 watts. 10 watts is enough to power most full size 3.5″ desktop hard drives in external enclosures. It is enough to drive a reasonably bright 20″ LCD monitor. With a little bit of power conserving design, it may be possible to do away with the need for power adapters for most common, present-day PC peripherals except laser printers.

-Thunderbolt lets you daisy chain up to 7 devices. All the devices chained together have to share the Thunderbolt port’s overall bandwidth and power allotments, but both are fairly ample. The daisy chaining ability, combined with more directly powered peripherals, means a lot fewer cables will be needed to connect all your peripherals to your CPU unit, and a lot of those cable runs will be shorter. In brief, way less desktop mess / tangle of cables.

-Thunderbolt tunnels the PCI Express protocol as well as DisplayPort. Since tons of interface chips are already designed to plug into PCI Express buses, it will be relatively trivial for 3rd party device manufacturers to take existing designs for internal peripherals and create “external peripheral” versions of the same. This, combined with much friendlier licensing and Intel’s backing of the underlying technology, could make Thunderbolt a rapid starter, whereas some of the “inside baseball” aspects of Firewire led to its slow adoption and lack of mainstream support compared to USB 2.0.
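To make the figures above concrete, here is a rough calculator. The idealized sustained link rates and the even split across a full chain are my simplifying assumptions; real devices rarely sustain headline rates, and allocation is negotiated per device rather than split evenly:

```python
# Back-of-envelope numbers from the bullets above; for scale only.

THUNDERBOLT_BPS = 2 * 10e9   # two bi-directional 10 Gbps channels
USB3_EFFECTIVE_BPS = 3.2e9   # USB 3.0 after encoding overhead
PORT_WATTS = 10.0            # Thunderbolt port power budget
MAX_CHAIN = 7                # maximum daisy-chained devices

def transfer_hours(payload_bytes, bits_per_second):
    """Hours to move a payload at a given sustained link rate."""
    return payload_bytes * 8 / bits_per_second / 3600

one_tb = 1e12  # a 1 TB backup
print(f"1 TB over one Thunderbolt channel: {transfer_hours(one_tb, 10e9):.2f} h")
print(f"1 TB over USB 3.0:                 {transfer_hours(one_tb, USB3_EFFECTIVE_BPS):.2f} h")

# Naive even-split view of a maxed-out daisy chain
print(f"bandwidth per device (7-deep chain): {THUNDERBOLT_BPS / MAX_CHAIN / 1e9:.2f} Gbit/s")
print(f"power per device (7-deep chain):     {PORT_WATTS / MAX_CHAIN:.2f} W")
```

Even a fully loaded chain leaves each device close to the effective bandwidth of a dedicated USB 3.0 link, which is what makes the “one cable out of the computer” picture plausible.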

Am I going to jump in headfirst and order a Macbook Pro today? No, but if Apple doesn’t try to play this one too close to its chest (and smother the baby in the process), Thunderbolt has the potential to truly become the “universal” bus that USB has long claimed to be.


Posted in General, Mac, Rants and Raves, Tech Stuff | No Comments »