Thursday, May 31, 2012

Wild Rumors of the Mac Pro's Demise

There has been wild speculation that Apple will axe the slow-selling Mac Pro line. Over 16,000 people have signed a Facebook petition, and rampant speculation on the Internet suggests that Apple wants pros to use iMacs instead.



The argument goes like this: Apple has been making loads of money on iDevices and has no time for slow-selling products. The Mac Pro does not sell in high volumes like the other products. Basically, the Mac Pros have taken a back seat to iPads, iPhones, and MacBook Airs. On the surface, this makes a lot of sense. But I am here to say they are WRONG.

Before I go into my argument, I want to remind people of a few things: Apple needs content and app developers for their iOS devices. With the Retina displays of the new iPad and upcoming MacBooks, you need beefy workstations that can possibly drive 4K-class (4096-pixel-wide) monitors. Furthermore, why would Apple be touting Final Cut's ability to do 4K video if all they have to sell are iMacs? Lastly, Apple's rumored acquisition of the Italian music-editing startup Redmatica suggests they are still in the content creation business.

Now for my main argument.

The reason Apple hasn't updated the Mac Pros in over two years is very simple. The fault lies squarely with Intel. Yes, it is all Intel's fault. Just as IBM failed to deliver updated G5 PowerPC processors in the past, Apple is at the mercy of its CPU supplier.

The workstation-class Xeon architecture has not been updated in over two years. Intel did not have dual-socket Sandy Bridge Xeons until May 14, 2012, just 17 days ago. Don't believe me? Go to Wikipedia and Intel's product pages:

http://en.wikipedia.org/wiki/List_of_Intel_Xeon_microprocessors#Xeon_E5_.28dual-processor.29
and
Intel® Xeon® Processor E5-2420 http://ark.intel.com/products/64617

So all you guys claiming you can build Hackintoshes better than what Apple has been giving us for the past two years need to give it a rest.

A quad- or six-core i7 is not in the same class as a multi-socket Xeon. The most memory you can get with an i7-3930K is 32GB. With the two-year-old Mac Pro architecture, you can go up to 64GB. The new E5-2420 Sandy Bridge Xeon supports up to 375GB of RAM.

Now, for folks who don't understand all this technical babble, I'll make it simple.

Xeon-class architecture allows system builders to build multi-socket systems, meaning you can have two CPUs on the same motherboard. Put in two six-core CPUs and your system is seen as a 12-core machine. Simple, isn't it?

For the past two years, Intel was slow to deliver. The consumer-grade i7s could not be made into multi-CPU systems, meaning you couldn't go to Fry's Electronics, buy two i7 CPUs, and put them on one motherboard.
To do this, you needed a Xeon-class system: CPU, motherboard, and architecture. Furthermore, Xeon-class systems are more industrial; they often run ECC-protected memory.
A Hackintosh i7 will not be the same as a Mac Pro or any Xeon-class workstation from companies like HP or Dell.

Sure, the benchmarks of the consumer-grade Sandy Bridges have been smoking some of their Xeon server-based counterparts. I won't deny that. However, I can't see how Apple could have produced a single-socket Mac Pro, priced it at $2,000, and called it a workstation.

They could have made a new line called the Mac Pro Mini: a mini-ATX-styled Mac Pro, in the same vein as the PowerMac 8100 was to the PowerMac 9100. It would be a smaller box with less expansion: possibly a single socket, 4 banks of RAM, 1 drive bay, and 1 PCI slot. They could have sold it at $1,500 and I would have been one of the first buyers. However, I don't have access to their sales figures, so I don't know if that would have been a wise move.

Am I happy about the situation? No. But people need to evaluate the facts first. Until there is final word from Cupertino, everything is pure speculation.











Wednesday, May 30, 2012

Whatever happened to true docking stations?

Today, I won an eBay auction for a brand-new Lenovo ThinkPad Mini Dock 3, and it brought back some memories. I am thinking about the convenience of docking at work and having my stuff all ready to go: no hassle of cables and clutter.

Unfortunately, modern-day docking stations are simply port replicators for convenience. In the past, they were much more.


Long before the turn of the millennium, you could buy a true desktop-replacement docking station filled with full-size slots and expansion capabilities. I clearly remember spending over $4,000 on an IBM ThinkPad 600e set up with a SelectaDock III. Dell and HP had competing lines of products for their workstation laptops. Apple had, well, they had the Duo MiniDock and Duo Dock.

The docking stations of yore had full PCI slots and on-board SCSI. That SelectaDock was truly a beast and exemplified IBM's build quality. I remember adding all sorts of PCI cards, like a dedicated MPEG-2 decoder that couldn't even compete with a modern smartphone. And remember the Adaptec 2940 Ultra Wide SCSI card? Yep, I had one of those running off a ThinkPad, along with "Scuuuuuzeeeee" server-class drives. I almost forgot: I also had a floppy-bay Iomega Zip drive. I could literally have hosted an ISP back then.





I can't really complain about my new setup. A Lenovo port replicator it is! I wonder how the ThinkPad brand would have flourished if IBM had kept it.



However, today's laptops will do so much more. The T- and W-series ThinkPads have ExpressCard slots and can power dual or triple monitors at up to 2560x1600 through DisplayPort. Expansion is handled by USB and FireWire.

The new MacBook Pros have Thunderbolt, and you can get expensive Thunderbolt-to-PCIe bridge enclosures. I've seen them set up with RED Rocket video cards for working with 4K video.

Still, I can only imagine the possibilities if manufacturers made full docking stations instead of port replicators. I also wonder what a modern-day 600E/600X ThinkPad would be like. Those things were built like tanks.

I will post a review of the Lenovo "port replicator" and how it works with Linux in the coming weeks.

Long-term HP TouchPad review


I was one of the lucky few to grab a few TouchPads during the infamous "fire sale."
It has been nine months of ownership, so how does it hold up? I'm still undecided.

First of all, the hardware is clunky and unreliable. I bought one for my dad and it was sent in FOUR times for service. During the post-fire-sale period, HP did not issue replacements, so each time it went in for service, my dad was without his TouchPad for weeks. His TouchPad runs stock webOS, so his problems were strictly hardware. My 32GB model has been sent in once, and there is already a crack along the speaker edge. Moreover, the device is rather heavy. I have owned every generation of iPad, and the TouchPad feels heavier than the first-gen iPad.

At $99-$149, I really can't complain. There were times I was tempted to unload them on Craigslist or eBay. I wish I had, because I could have gotten $300 or so for a 32GB model in the early days.

As luck would have it, there was always something that made me keep it.

I'll list a couple of cool things.

I could run Basilisk II to emulate a classic Mac "in color."


I installed Ubuntu on it. Running MySQL Administrator, Python, and full desktop Chromium/Firefox was cool. The webOS chroot doesn't require a goofy VNC server like on Android; webOS's Preware gives you a native X11 server.





Of course, you can install Android Ice Cream Sandwich via CyanogenMod. Some things don't work, like the webcam and the headphone jack, but Alpha 2 brought accelerated video playback. Netflix works great on this device.



I also run MAME on it inside my iCade cabinet.



Since I have four tablets in the household, I still don't have a good reason to keep it around, except that it is now pretty much a worry-free, disposable device. For example, when I work out on the treadmill, I prop up the TouchPad without worry; I don't get paranoid about dropping or breaking it. Whenever all the iPads have been grabbed and accounted for, I can always pull out the HP TouchPad to use as a Netflix player. That is pretty much what it has been relegated to. My four-year-old son feels punished when I hand him the TouchPad because of the lack of games and entertainment options for Ice Cream Sandwich (ICS). In my family, I am the one who always ends up with the TouchPad.

Everyone else I know who got a TouchPad during the fire sale is in the same position as me. Their devices have been socked away in some drawer, collecting dust.


So I guess I can't complain. They were pretty cheap to begin with, and I could never imagine spending $500 for one. So now, MAME and Netflix are the reasons I keep the TouchPad around.







Tuesday, May 29, 2012

How to debug AJAX

One of the things I do is mentor junior web developers. One of the most-asked questions is how to debug AJAX calls, i.e., XMLHttpRequests, also known as XHR.

Well, it is very easy. Most modern web browsers have developer tools built in. There are native tools in IE, Safari, and Chrome, often called web inspectors or developer tools. With Firefox, you can use Firebug.

Each browser differs in how you access the XHR console. I won't go into details, but it is very easy to find. Launch your web inspector, or right-click and choose "inspect element," before you make your AJAX call. Once the developer tool is open (it is called various names and differs across versions of the same browser), look for the Resources or Network panel, then narrow it down to the Scripts or XHR (short for XMLHttpRequest) filter.



Below are screenshots of Chromium under Ubuntu 10.10 and Safari under Mac OS X.




Whenever I make an AJAX call, I look for the script/resource I am calling. In both screenshots above, I am calling a script called ajax/_ajax_sample.php.

Here is the HTML I use to make my AJAX call. It is a simple AJAX request, passing some variables as form-like POST data.

 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">  
 <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">  
 <head>  
 <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js" type="text/javascript"></script>  
 <script type="text/javascript">  
 $(document).ready(function() {  
      $.ajax({  
           type: "POST", url: "ajax/_ajax_sample.php", data: { action: 'update file', user_name : 'joe blow', user_id: 100 },  
           dataType: "json",  
           success: function(data){  
                if (data.status == 'success') {  
                     if (data.post_action == 'show_div') { $("#results_div").html(data.msg); }  
                } else {  
                     alert("There was an error: " + data.msg);  
                }  
           } // end of success callback  
           }); // end of ajax  
 });  
 </script>  
 </head>  
 <body>  
 Results:  
 <div id ="results_div" style="display:block;width:400px;background-color:#CCCCCC">Nothing Yet</div>  
 </body></html>  


If you go back and look at the two browser screen grabs, you can see the POST data I am passing. This is shown in the "Headers" tab. The Headers tab shows what you post to your resource, along with other behind-the-scenes info such as the request headers, the HTTP method, and the response headers (how the server tells the browser it will respond).

For our purposes, the only thing I am looking for is the "Form Data" showing what I posted; I just need to know whether it is correct. The fields are:

 action:update file  
 user_name:joe blow  
 user_id:100  

My script or resource takes the form-data POST from my AJAX call in the same manner it would receive data sent through an HTML form or query string.

This is how you check whether you are posting the right data or not.

When your CGI resource or script gets the request, it will eventually return some data. The web inspector will also give you a read-out of the time from when the request is sent to when it finishes, so you can see how efficient your back-end code is. See below for an example of latency and processing time.



The results are often HTML or JSON. To see what a call returns, simply click on the "Content" tab. There may be other tabs, such as Preview or JSON, that show the same results.

As you can see in the next screenshot, my result was JSON. If you go back and look at the HTML file posted above, I take the JSON and use it to run more JavaScript based on what came back, namely filling the data into a div called results_div. The status returned was 'success'.

 {"status":"success","msg":"your ajax worked mr\/mrs joe blow","post_action":"show_div","record_id":40}  







Now, in this next screenshot, I introduced some errors. My HTML page made two AJAX calls and got 404 errors, which means the page could not find the two file resources.

For the third error, I introduced a bug in my PHP script. The web inspector shows a 500 Internal Server Error: my AJAX call is passing the correct data, but there is something wrong with the PHP.

Fortunately, I have a terminal window open above it to tail (monitor) my Apache logs for PHP errors. By running tail -f on my Apache error_log, I can see the error is on line 6 of my PHP script. It is handy to have a monitor big enough to watch the logs and debug your web application at the same time.
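If you want to watch the same thing on your own setup, something like this works. The log path here is an assumption; it varies by distro and by your Apache configuration:

 # Debian/Ubuntu default path; CentOS/RHEL usually uses /var/log/httpd/error_log  
 tail -f /var/log/apache2/error.log  
 # or only watch for PHP errors  
 tail -f /var/log/apache2/error.log | grep --line-buffered PHP  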



This is the most common problem I see when a new web developer starts to use plugins, free lightboxes, modals, and unfamiliar scripts. They see an animated AJAX spinner and can't figure out why the page isn't doing anything. By using their browser's developer tools, they can see whether they are not linking to the right resource or whether their back end is simply broken.

Clonezilla


So I am getting a new computer this week. As usual, I follow a strict methodology of preparation for new equipment.

Today's post will be about Clonezilla (http://clonezilla.org/).
In short, Clonezilla is a free, Linux-based disk imaging and disaster recovery system, similar to Norton Ghost.



Whenever I get a new computer running Windows or Linux, I run Clonezilla. Before even booting into the OS, I clone the drive into a disk image. The drive may be 750GB, but the OS install and apps may only take up 20GB. Using Clonezilla, I save a factory-fresh, ready-to-restore image to a portable drive or to a NAS server (via SFTP). In the event I want to sell an older computer, I can restore the drive to its factory-original state.

After I install my applications and set up my system, I do another clone image in case of disaster, viruses, or anything unexpected.

Clonezilla can be installed onto a bootable USB stick or CD.

Usage is very straightforward. You pick the drive or partition you want to clone, and you can clone either to another drive or to image files. The image files can go to an external drive or to a remote server volume, and they are chunked into small pieces so they can fit on filesystems like FAT32.

Restoring is the opposite process. The interface is an old-school, text-mode menu; there is nothing to click and it isn't pretty, but it works.

Instead of a bare-bones Clonezilla install, I suggest PartedMagic. PartedMagic is a complete bootable Linux distro with a bunch of tools, including Clonezilla, GParted (for redoing partitions), TrueCrypt, TestDisk (for recovering files off a crashed hard drive), and a bunch of other recovery applications.

In addition to physical machines, I routinely use Clonezilla along with GParted to enlarge virtual machine images. For example, if I have an 8GB VM that I want to grow to 20GB, I use Clonezilla to clone onto a larger virtual disk and GParted to increase the partition size. Hence, I suggest using a distro like PartedMagic.

More info on PartedMagic can be found here: http://partedmagic.com/doku.php

On the Mac, there is always Carbon Copy Cloner. Macs also have the beauty of Target Disk Mode, where a Mac can be booted as an external disk that another Mac can mount and use. Hence, I have never had to use anything like Clonezilla for imaging on a Macintosh.
  

Monday, May 28, 2012

Goodbye ThinkPad X120e. A long-term review.




I just recently unloaded a trusty X120e on Craigslist. It served me well for over a year, and it is now time for a newer Linux machine. A ThinkPad T420/T430 is calling my name.

This post is a long-term review of the ThinkPad X120e.

Released in March 2011, the X120e, along with the HP dm1, was part of the first wave of AMD Fusion-based Atom killers. Unlike a typical netbook, these had a better-performing GPU (AMD calls the combined chip an APU) and a larger screen. Before the "ultrabook" craze, there were cheap netbooks and the MacBook Air. Since I already have a MacBook Pro, it made no sense for me to get another one. With a project that required Windows 7 and Kinect, this machine landed on my lap. Since then, I've been using it as a second/third computer.

I've been through various netbooks, from the original Asus Eee PC 701 and Acer Aspire One to a Dell Mini 9. The typical netbook resolution of 1024x600 was god-awful, so this was an improvement at 11.6" and 1366x768. Furthermore, it didn't feel as crippled and dog-slow as the Atom-based CPUs.

A bit bulkier than a typical netbook, it was still very lightweight at 3.4 lbs. Notable features include an SD slot, three USB ports, and HDMI. Very much like any typical netbook, except this was a ThinkPad. Some people wouldn't call it a real ThinkPad, but to me it was close enough: the red nubby TrackPoint and black plastic are what make it a ThinkPad for me. I am a big fan of the no-nonsense, spartan black business look. Plus, there is the legendary reputation of ThinkPad keyboards, and this chiclet keyboard is one of the best I've used, with a decent amount of travel.

Battery life under Linux is about 5 hours. I hardly ever use Windows 7, but when I checked, the indicator showed around 6.5 hours with aggressive power management.

As for upgrades, I added 8GB of RAM and a 120GB solid-state drive (SSD). Despite the low-end CPU, the machine was rather snappy for my occasional use (database queries, shell scripting, and connecting to an overseas VPN for downloading large files). I didn't play games or watch movies on it, so I can't comment on its multimedia capabilities. With the SSD, Ubuntu boots to the login screen in about 15-20 seconds.

I had no problems with Ubuntu 10.10 and 12.04. Everything pretty much installed without a hitch: the microphone works, the SD slot mounts, it goes to sleep, Wi-Fi connects, and HDMI works. In other words, none of the typical Linux laptop nightmares. The only major complaint is its inability to drive 1080p in a dual-display setup: it does not have enough processing power to run the built-in 1366x768 display and an external 1920x1080 monitor at the same time. I could run 1024x768 alongside an HD monitor. The other solution is to run just the external monitor and power down the built-in LCD.

If you are looking for something on a medium budget, this or the newer X130e may be worth some consideration.

MongoDB GUIs


In my day job, I tend to keep up to date with relevant technologies, and one of the latest web buzzwords is NoSQL. CouchDB, Cassandra, MongoDB, and many other NoSQL alternatives have been gaining popularity with the young folks. As usual, it is my job to keep abreast. I've been entertaining myself mostly with MongoDB because I think it is one of the easiest and quickest to learn.

If you are looking to explore MongoDB, there are some great GUI tools to get you started. Installing MongoDB is pretty trivial, so I won't cover it here.




On OS X, one of the best programs I've come across is MongoHub. It is a very pretty and intuitive application. Within a few minutes, I made tangible progress in evaluating MongoDB. It can import a MySQL table schema into a MongoDB collection. I imported a working MySQL database I had been working on, and I was quickly able to make queries just from glancing at the MongoDB reference.

Instead of the normal “SELECT db_column FROM table WHERE db_col = value AND db_col2 = value2”, you use a JSON-style (BSON) query like this:

 db.COLLECTION.find({'key': 'value'})

Since we are not using the console, there is no need to invoke the MongoDB find command; it is all GUI-driven. For my imported collection, I simply typed in a query expression like: {'City' : /^Con/i }

Instead of playing with “Hello World” tutorials, I had a working set of data for evaluation within five minutes of installation. There was no need for test records or dummy data; I had real working data that I was already comfortable with. Within another 20 minutes, I was able to write some PHP scripts to query and display records.

MongoHub can be found here:
http://mongohub.todayclose.com/


I haven't found anything on Linux comparable to MongoHub, but these two solutions worked for me: phpMoAdmin and JMongoBrowser.

Once you have the PHP MongoDB driver installed, you can run phpMoAdmin as a web-based admin script. It works, and you can create collections and documents rather quickly. It is PHP-based, and there was no configuration or mucking around.

phpMoAdmin:
http://www.phpmoadmin.com/




The other GUI is JMongoBrowser. It is cross-platform and Java-based. Again, it works, but there is nothing to write home about.

JMongoBrowser:
http://edgytech.com/jmongobrowser/



Both phpMoAdmin and JMongoBrowser installed on Ubuntu 12.04 without issues. They also run on Mac OS X.

Ramdisk vs SSD on Ubuntu

With the advent of fast SSDs capable of reading and writing 200-500 MB a second, is there still a need for ramdisks? I decided to try it out on Ubuntu 12.04.

If you are wondering what a ramdisk is, it is simply using your physical RAM as a temporary storage drive. Instead of writing to disk, you are writing files to memory.

And the results of my testing?

Well, I'll let these pictures speak for themselves:

First, the ramdisk: average read speed of 1.2 GB/s, as in gigabytes per second. An entire DVD movie's worth of data would take less than 5 seconds to copy.


Second, a Corsair F120 SandForce-based SSD: read speed benched at 234 MB/s, which is no slouch and faster than any platter drive. The same DVD would take roughly 22 seconds.


For comparison, on a standard platter HDD at 60 MB/s, that DVD would take about 81 seconds. On a 10 MB/s USB stick, it would take about 486 seconds, or roughly 8 minutes.
  
I also tried a VirtualBox VDI image on the ramdisk, and an Ubuntu 10.10 image loaded in less than 6 seconds.

Here is how you make a ramdisk:

 mkdir -p /tmp/ramdisk  
 sudo mount -t tmpfs -o size=1024M tmpfs /tmp/ramdisk  


Or, you can simply copy files to /dev/shm/, but you risk saturating all your available RAM. By using tmpfs with a size option, you can set a limit; in my example, the ramdisk is 1GB.
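If you want to sanity-check the numbers yourself without a GUI benchmark, a quick dd run will do. This is only a rough sketch (not the tool I used for the screenshots above), and the file name and sizes are arbitrary:

 # write 512MB into the ramdisk, then read it back; dd prints the throughput when it finishes  
 dd if=/dev/zero of=/tmp/ramdisk/testfile bs=1M count=512  
 dd if=/tmp/ramdisk/testfile of=/dev/null bs=1M  
 rm /tmp/ramdisk/testfile  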

It should be noted that ramdisks are not persistent. They need to be recreated after a reboot, and you lose whatever is in the ramdisk when you power down.
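If you want the empty ramdisk recreated automatically at boot (the contents are still lost, of course), an /etc/fstab entry like this should do it; this is a sketch assuming the same 1GB size and mount point as above:

 # /etc/fstab entry to mount a 1GB tmpfs ramdisk at boot  
 tmpfs   /tmp/ramdisk   tmpfs   defaults,size=1024M   0   0  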

I am currently exploring options for a real-time ffmpeg transcoding system that will read and write to disk quite a bit. I also have another use case with ImageMagick/Ghostscript writing large temporary PDF files. A ramdisk may be the way to go.

There is another interesting use of ramdisks: running completely private and secure micro servers, like tor-ramdisk, to evade the authorities. Data simply disappears on power-down, so if your equipment were seized, the data would vanish and make forensic analysis much harder.

Now, I just need a laptop with 32GB of RAM.


Sunday, May 27, 2012

Ubuntu 12.04

I finally made the switch from Ubuntu 10.04 to 12.04 on my personal machines. For my professional needs, I am still using CentOS. For the past year, I avoided upgrading Ubuntu because of the switch to Unity (aka Netbook Remix version 3).

However, as many Linux users with a Galaxy Nexus know, MTP (Media Transfer Protocol) does not play nice with Linux. In short, I can't just connect my phone and transfer files from my Linux box.
Improved MTP USB access for ICS (Ice Cream Sandwich) devices was one of the rumored features of 12.04. This, and this alone, was reason enough for me to set aside my doubts about the new release of Ubuntu. So I took the plunge and updated a few machines to 12.04.

Well, that didn't turn out too well either. You still have to muck around with FUSE and mtpfs and fiddle with fstab entries. Not my idea of intuitive or fun. The MTP issue extends to other devices like the new Galaxy Tab 2. Don't get me wrong: I could get my phone to mount maybe 3 times out of 20, and maybe 1 time in 30 the connection wouldn't drop during a copy.
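For the curious, the mucking around looks roughly like this. This is only a sketch of what worked some of the time for me, assuming the mtpfs package and a mount point of my own choosing; your mileage will vary as much as mine did:

 # install the FUSE-based MTP filesystem  
 sudo apt-get install mtpfs  
 # allow_other requires uncommenting user_allow_other in /etc/fuse.conf  
 sudo mkdir -p /media/galaxynexus  
 sudo mtpfs -o allow_other /media/galaxynexus  
 # ...copy your files, then unmount...  
 sudo umount /media/galaxynexus  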

There were also rumors of exFAT support for cross-platform filesystem sharing. Again, that didn't turn out to be true either.

Still, I took the plunge and switched. I'm still not a fan of Unity, so I run Ubuntu in classic GNOME mode.

Besides running classic GNOME, there are a few things that are cool. There is reliable Cisco IPsec and OpenVPN support; I no longer get dropped connections and can reliably stay connected for 12-plus hours.

Another surprising thing for me was better iOS support. Even with iOS 5.1, Ubuntu 12.04 lets me mount an iPad and copy files to it. I can copy movies and spreadsheets onto my iPad 3 via drag-and-drop. Again, it is very surprising that Ubuntu supports iOS better than it supports Ice Cream Sandwich. You'd figure that since Android is Linux, it would play nice with other Linux devices.

The screen grabs below are proof enough.


Copying PDFs to the iPad is 10 times easier than copying them to my Samsung Galaxy devices.


As you can see here, MTP file access is still problematic. In 2012, people shouldn't have to detect USB devices from a terminal, write fstab entries, and manually mount devices in the console.


Other things I like:
AirPrint is built in. I can print from my iPad using the Ubuntu box as the AirPrint host print server; this is the default in the CUPS setup.

I also dig the wannabe OS X Time Machine-style backup. I like the fact that it can back up my files to a remote server over SFTP.


Now, there are some problems I haven't been able to sort out yet. I was not able to install 12.04 on some older Dell PowerEdge 2850/2950 rack servers, and I'm not alone: fellow co-workers could not get it to install either. I did not find a JeOS or shell-only install option; the CD/USB took me straight to a live session. I suppose there may be some special keyboard shortcut at boot or something else trivial, but I never pursued it. 10.04 LTS will stay on my servers, along with CentOS, for the time being.

Overall, I am pleasantly surprised and not quite ready to write off Ubuntu.

iPad 3 as a Cinema Display on-the-go


Friday, May 25, 2012

Torque on Android. An OBD-II car reader application

One of the best pieces of software for Android is Torque.






Torque is an OBD-II diagnostic app that can be used to clear error codes and diagnose your car. Pretty much all cars from 2003 or so onward have OBD-II as a standard protocol. You can buy an OBD-II reader at your local Kragen or get your fault codes cleared at your dealership. Torque brings those capabilities to your smartphone.


This is a $5 app that deserves some attention and praise. To use it, you need a Bluetooth OBD-II reader. The most common is the ELM327 Bluetooth dongle that can be found on eBay. Most of them are cheap, no-name Chinese units, so I couldn't tell you which one works better than another. I got lucky; mine works.




I use it just to fart around, nothing serious. You can time your zero-to-sixty runs or measure oil pressure if your car doesn't have an oil gauge. Frugal types can check their MPG in real time. There are countless things you can do with it, and it helps you get a better understanding of your car. All my cars are fairly new, so I don't have to worry about clearing fault codes yet. There is also a GPS tracker built in, so you can download your tracks and use it as a black-box recorder.


I do have one funny story to tell. Last year, I was looking to buy a used truck and brought my OBD-II setup along. The seller (at a used car lot) panicked when I told him I was going to check the truck for error codes. He quickly shooed me away and didn't want to sell me the truck anymore.



Nagios


Nagios is billed as a network and infrastructure monitoring application.


There is a Wikipedia entry on it: http://en.wikipedia.org/wiki/Nagios





For me, it has saved my butt more than a hundred times. In simple terms, it is the system I use to monitor my company's network, servers, and entire IT assets. When a mail server goes down, I am instantly paged and notified. This is the tool to use to monitor your servers, switches, and other hardware for downtime.


This is one of the killer "Linux" apps. Sure, it probably runs on other platforms, and sure, there are probably other infrastructure monitoring apps that work with Windows with minimal fuss. However, Nagios is free and runs on minimal hardware; in essence, it is cheap. It takes an afternoon of your time to configure some /etc/ files. Trust me, you will be rewarded in so many intangible ways.


Nagios was one of the first examples of how IT snuck Linux boxes into the closets of most enterprises. About 7-8 years ago, I built an inexpensive Linux rack server from the local white-box computer shop. Back when I was younger and piecing motherboards together was my idea of fun, Linux started to become my de facto go-to solution for everything. The entire cost of the project was a few hundred dollars, using the cheapest PC components we could scrape together.

With Nagios, I would know well beforehand when a mail server went down. I would rather have a machine tell me something is wrong than have the boss call and ask why he or she can't get mail.

The years went by and the box did its job. About two years ago, we started to consolidate physical machines into virtual machines, which also gave me a reason to upgrade to version 3 from whatever I was running previously. I rebuilt the Nagios instance on a small JeOS (Just enough OS) build of Ubuntu. It was very minimal, very small, and very lightweight. Console-only, the VM image was portable enough to fit on a small USB stick. That is the power of Linux.



In short, all you really need to do is write some config files and enable the service; it runs as a daemon. The most common action is obviously notification, but you can also script external commands. A nifty trick is to launch a VM failover in the event a primary server becomes unavailable. A minimal example follows below.
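To give a flavor of what those config files look like, here is a minimal sketch of a host definition and an SMTP check. The host name and address are made up, and the file path and template names follow the stock sample configuration of a source install, so adjust for your distro's packaging:

 # e.g. /usr/local/nagios/etc/objects/mailserver.cfg, picked up via a cfg_file or cfg_dir directive in nagios.cfg  
 define host {  
     use         linux-server  
     host_name   mailserver01  
     address     192.168.1.10  
 }  
 define service {  
     use                  generic-service  
     host_name            mailserver01  
     service_description  SMTP  
     check_command        check_smtp  
 }  

After editing, verify the config and reload the daemon (or however your install manages the service):

 sudo /usr/local/nagios/bin/nagios -v /usr/local/nagios/etc/nagios.cfg  
 sudo service nagios reload  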
Typical email responses from Nagios



Everything is routed to my iPhone. I've been blogging about various Android devices in my other posts, so now it is time to give the iPhone some love. The iPhone/iPad is the only tool I depend on for working with Nagios. I can access and configure my Nagios box from anywhere my iPhone has a signal:









1) I have a secure Cisco VPN tunnel that works flawlessly with iOS (it has problems with ICS). If I can't connect to my network, a whiz-bang dual-core phone with a 4.7" 720p screen is of no use to me.
2) I get push notifications via a Dovecot/Cyrus IMAP mail server. My Galaxy Nexus only supports push email via Gmail or Exchange, and the K-9 email client from the Play store doesn't work for me.
3) Excellent console access. I prefer the SSH app on my small 3.5" iPhone screen over Android's ConnectBot running on my 4.65" Galaxy Nexus or 7" Galaxy Tab 2.

It is all about usability, and the iPhone works. Android is flaky with push IMAP, and others have suggested I use SMS. It is critical that I get notified within a few seconds rather than 5-10 minutes later. Now you see why the iPhone works for me.



You can configure Nagios to send SMS, but I've had both AT&T and Verizon block my messages because the notifications did not come from a mail server with proper MX records (e.g., when my primary mail server goes down!). In addition, 30-40 messages can arrive at once, so I can see why they would be blocked as spam. Hence, I configured Nagios to send alternate notifications through a Cyrus/Dovecot IMAP server. I love open source. I find it strange that Apple's iOS has better support for open-source, Linux-based mail servers than Google does.




If you are serious about your network and infrastructure, I suggest you research Nagios and see if it will work for you.


To end, I am hoping I don't get any more notifications at 3 AM this Saturday night.

Thursday, May 24, 2012

Galaxy Tab 2 Android running Mac OS

As you can see, there is a common theme here: retro goodness and '80s nostalgia. Here we have a Samsung Galaxy Tab 2 running a classic operating system, Mac OS 7.5.

All you need is a Mac Classic or Plus ROM, a disk image of the OS, and the Mini vMac emulator from the Google Play Store.




If I have some time, I'll pull up a copy of Basilisk II running Mac OS 8 on an HP TouchPad running webOS.

jQuery AJAX-style upload with callback

By now, you have probably seen websites with "AJAX"-style uploads. Technically, many of them are not really AJAX: a true XMLHttpRequest (AJAX) upload has traditionally not been possible due to security limitations of JavaScript. However, we still call them AJAX uploaders because they act and feel like it to the end user, meaning the web page posts files without refreshing.

So what are these "AJAX uploaders," then?


Many of them are actually SWF (Flash-based) uploaders billed as "jQuery file upload plugins." I'm sure many of them work great, but I prefer to avoid Flash as much as possible.

There is also HTML5 support in some newer browsers for asynchronous file uploads via AJAX POST, but I've had problems with certain browsers, like Safari, and with different file types.


Today, I will show you how to simulate an AJAX upload without the use of Flash. If you have done some googling, you know the most common way to do it is with a hidden iframe. This is considered a hack, but it works. There are some tutorials out there, but mine will show you how to get a callback from your upload script using jQuery. You can use pure JavaScript, but jQuery is very convenient.
The callback will be a JSON reply that the host page (the one doing the upload) can retrieve and act upon. For example, if the upload failed, you can notify the user, or you can pass back the record ID of the file after it was stored in a database.


 First, you need to add a hidden iframe (mine is called upload_hidden) to your upload page.





 <body>  
 <iframe id="upload_hidden" name="upload_hidden" src="blank.html" style="display:none;"></iframe>  
 <form enctype="multipart/form-data" method ="POST" action ="upload_json.php" id ="upload_form">  
 <input type="file" name="upload_file"><button name="Upload" type="submit" value ="Upload">Upload</button>  
 </form>  
 </body>  


Then you need to set the form to post to the hidden frame.


 function setTarget() {  
   document.getElementById('upload_form').onsubmit=function() { document.getElementById('upload_form').target = 'upload_hidden';}  
 }  

 Then make sure you call it onload.


 window.onload=setTarget;  


Now for the pseudo-callback. The trick is to check every time the iframe loads new content and parse the result. The way I do it is to have the upload script embed my JSON reply in a div.

After processing my upload, my PHP code generates the JSON wrapped in a DIV.


 <?php  
 $finished = array ('status'=>'success','time'=>date('Y-m-d H:i:s',time()),'db_insert_id'=>$record_id, );   
 echo "<div id ='upload_status'>";  
 echo json_encode($finished);  
 echo "</div>"; ?>  

And here is my jQuery code to parse the JSON that the PHP script loads into the hidden iframe's div.
  $(document).ready(function() {  
   $('#upload_hidden').load(function() {  
     var a = $("#upload_hidden").contents().find("#upload_status").html();  
     if (a !=null) {  
     var obj = jQuery.parseJSON(a);    
       if (obj.status == 'success') {  
           alert("file saved to db as " + obj.db_insert_id);  
         } // #end success  
       } // #end a!=null  
     }); // #end upload_hidden load  
 });  

The key things to note are:
a = $("#upload_hidden").contents().find("#upload_status").html();
 and
obj = jQuery.parseJSON(a)

Every time the iframe loads, I look inside it for anything inside a div called "upload_status." When the iframe is initially loaded with the blank placeholder, nothing happens because the content is empty. When there is anything there, though, I parse whatever is inside the div as my JSON string.


After uploading a file, here are the results. Obviously, you would need to hide the iframe after you do some testing.

To wrap up: this is one way to handle callbacks in a pseudo-AJAX file upload. There are probably other ways to do it, but this was something quick that works for my needs.
Here is the example code.
 <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">  
 <html xmlns="http://www.w3.org/1999/xhtml">  
 <head><meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />  
 <title>Example</title>  
 <script type="text/javascript" src="http://code.jquery.com/jquery-latest.js"></script>  
 <script type="text/javascript">  
 function setTarget() {  
   document.getElementById('upload_form').onsubmit=function() { document.getElementById('upload_form').target = 'upload_hidden';}  
 }  
 $(document).ready(function() {  
   $('#upload_hidden').load(function() {  
     var a = $("#upload_hidden").contents().find("#upload_status").html();  
     if (a !=null) {  
     var obj = jQuery.parseJSON(a);    
       if (obj.status == 'success') {  
           alert("file saved to db as " + obj.db_insert_id);  
         } // #end success  
       } // #end a!=null  
     }); // #end upload_hidden load  
 });  
 window.onload=setTarget;  
 </script>  
 </head>  
 <body>  
 <iframe id="upload_hidden" name="upload_hidden" src="blank.html" style="display:none;"></iframe>  
 <form enctype="multipart/form-data" method ="POST" action ="upload_json.php" id ="upload_form">  
 <input type="file" name="upload_file">  
   <button name="Upload" type="submit" value ="Upload">Upload</button>  
 </form>  
 </body>  
 </html>  

iCade. Awesome retro goodness for iOS and Android


I got this last Christmas and I am still digging it. This nifty cabinet brings back wonderful '80s retro-style nostalgia. Designed for the iPad, it also works with Android devices. You can get it from ThinkGeek.

Here it is with a Galaxy Nexus.







And a $99 HP TouchPad running CM7 (it also works with ICS). Street Fighter!




iOS games need to be designed specifically for the ION iCade controller. I have a couple of favorites, including AirAttack HD and Atari's Greatest Hits.

However, the most important one is iMAME4all (http://code.google.com/p/imame4all/).
iMAME4all for both iOS and Android supports the iCade. The controls work over simple Bluetooth, and the cabinet takes regular AA batteries and runs for months.

Dig Dug, Defender, Missile Command, Frogger, Street Fighter. Yes, this brings back a lot of great childhood memories.

Wednesday, May 23, 2012

Waiting for the 15" MacBook Pro refresh

I have two SSDs and a 27" Thunderbolt Cinema Display waiting for a new MacBook Pro refresh. I seriously hope the new MacBook Pros do not omit the ability to swap out drives.

Turn an iPad 3 into a Chinese knock-off MacBook Pro



Does this look familiar? It is an iPad made to look like a miniaturized MacBook Pro.

I originally spotted this Bluetooth case/keyboard combo after reading an article on 9to5Mac, long before I saw the aluminum Brydge keyboard on Kickstarter. The Brydge is expected to debut this fall, and I am sure it will be superior since it is made from aluminum versus ABS plastic.

However, this may be the next best option for someone who needs something now.

So what can I say about it? This keyboard-case combo is made somewhere in the Far East. There is no trackpad, but the keys strongly resemble those on modern-day MacBooks. Like other iPad keyboards, it has function keys for music, fast-forward/rewind, brightness, search, and home. Copy-and-paste and other keyboard shortcuts work with various iOS applications.
There is a USB port for charging. Three switches adorn the top: a Bluetooth toggle, a power toggle, and a USB charging toggle.

Look below: I can now use my iPad as an expensive SSH terminal. Typing on it is not bad at all; using this keyboard with iSSH or Pages is pretty good.




The Bluetooth keyboard case comes in two flavors, iPad 2 and iPad 3, so make sure you choose the right one; my iPad 2 did not fit in my iPad 3 case. As you can see, it adds a considerable amount of bulk. Pictured below is an iPad 3 in the case compared to the ultra-thin iPad 2.



In short, it looks good and does the job. I like the ability to charge the iPad through USB with its high capacity battery.


However, and this is a big however, it does feel flimsy and cheap. After a full month of usage, I can see the paint on the plastic pitting. The enclosure is also a bit loose, and the iPad can easily fall out. Hence, I have mixed feelings about it. It works for what I use it for, and I guess I'll be one of those waiting for the Brydge when it comes out. But for the next 5-6 months, I will have one of the coolest iPad cases around.

Adventures in USB OTG



One of the greatest features of “certain” Android devices is the ability to connect to USB devices via OTG (On-The-Go). Your phone or tablet is no longer the device but the host: you can connect USB drives as mass storage or plug in a mouse or keyboard.

For me, the coolest thing is the ability to use a phone or tablet as a VT100/HyperTerminal/minicom-style terminal. For years in data centers, I used an IBM WorkPad z50 (Windows CE) or a Windows laptop running HyperTerminal to connect to switches, routers, firewalls, UPSes, and the like.






For less than $2 (Amazon link), you can get an OTG cable that works with your Android device.

I was able to successfully connect a Galaxy Nexus and a Galaxy Tab 2 7” (using a different 30-pin-to-OTG cable) to various switches and firewalls.

That was a geek epiphany moment.