Converting Video Generated by Go-To-Meeting

GoToMeeting, the online collaboration and screen-casting tool, produces video using its own special codec, G2M3. It makes for really small files, but it is not a standard codec installed in any video editing package.

All is not lost. GoToMeeting provides this codec on their site for Windows Media Player. Copying the codec files into your favorite video editor's codecs folder will enable it to read these files. For this example, I'll be using the free version of Any Video Converter.

Install both the GoToMeeting codec from their site and Any Video Converter.

The GoToMeeting codec installer will place the codec DLL files here:
C:\Program Files\Citrix\GoToMeeting\366

Copy the DLL files into the codecs folder of Any Video Converter:
C:\Program Files\AnvSoft\Any Video Converter\codecs

Now Any Video Converter will be able to read and work with these files.

Thanks to the owner and readers of the SmartBear blog for pointing me in the right direction.

Simple RSS Reader Examples

This sketch is for two really simple RSS readers: one in PHP and one in JavaScript.

RSS, or Really Simple Syndication, is a text-based format for publishing news and information. Being text makes it easy to manipulate with your language of choice. I like PHP and JavaScript, so those are the languages I'm using for these examples.

Download my code for these examples

Files included in this example


First, let's look at the PHP example:

If you open reader.php, you’ll see two functions:

One is a PHP environment test that checks your PHP instance to make sure the functions I'm using are turned on.
Some hosting companies disable some of these extensions. If you don't have direct control over the server environment, it's always good to do a little probing to make sure things are going to work as expected. There is also a constant at the top of the page which will disable the test; once you've run this code on your server successfully, there is no need to waste CPU cycles on this test with every page load.
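Here is a minimal sketch of what such a probe can look like. It assumes the reader depends on file_get_contents(), simplexml_load_string(), and the 'allow_url_fopen' setting; the function name and the exact list of checks are my own, not the ones in reader.php:

```php
<?php
// Minimal environment probe: report anything the reader needs that is
// missing from this PHP install. The list of required functions is an
// assumption based on what this reader uses.
function check_php_environment($required_functions)
{
    $missing = array();
    foreach ($required_functions as $fn) {
        if (!function_exists($fn)) {
            $missing[] = $fn;
        }
    }
    return $missing; // an empty array means everything is available
}

$missing = check_php_environment(array('file_get_contents', 'simplexml_load_string'));
if (count($missing) > 0) {
    echo 'Missing functions: ' . implode(', ', $missing) . "\n";
}
if (!ini_get('allow_url_fopen')) {
    echo "allow_url_fopen is disabled; the reader cannot fetch remote feeds.\n";
}
```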

The second function is the RSS reader. Let's look at the feed data so we can understand what is going on here.

Example RSS2.0 feed data:

<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <item>
      <title>Welcome to my blog</title>
      <link>http://127.0.0.1/my-blog</link>
      <description>A simple example from my blog</description>
    </item>
  </channel>
</rss>

With SimpleXML, if I want the title from the first post, the code would look something like this:

$strData = file_get_contents('http://127.0.0.1/rss');
$oXml = simplexml_load_string($strData);
$title = $oXml->channel->item[0]->title;

PHP allows URLs to be loaded with any of the file functions as if they were files on the local filesystem. As such, I can use file_get_contents() to load the data from the URL into a string. (This depends on PHP's 'allow_url_fopen' setting, which must be enabled.)

SimpleXML is doing all the hard work. It turns that string of XML into a set of nested objects, which lets us programmatically get at the data in the XML document without having to build a complicated custom string parser.

In the example, all operations with SimpleXML are enclosed in a try-catch block. This is because SimpleXML will throw exceptions if the XML is not formed correctly (a missing open or close tag, reserved characters where there shouldn't be any, etc.), or if you try to access an element that does not exist.

You'll also notice in the example the third parameter of simplexml_load_string(). By giving the function LIBXML_NOCDATA, SimpleXML will automatically convert any <![CDATA[ ]]> blocks it encounters into strings. This sort of block is used to safely enclose characters that would otherwise be reserved by XML, without breaking the format. Without that option, our feed would return an empty SimpleXML object every time it encountered a <![CDATA[ ]]> block.
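Putting those pieces together, here is a self-contained sketch of the reader function; the function name is my own. The feed is an inline string here so the SimpleXML mechanics are visible on their own; in reader.php the string would come from file_get_contents() on the feed URL:

```php
<?php
// A sketch of the reader: parse feed XML and return an array of items.
function read_feed_items($strData)
{
    $aItems = array();
    try {
        // LIBXML_NOCDATA folds <![CDATA[ ]]> blocks into plain strings.
        // simplexml_load_string() returns false (rather than throwing) on
        // malformed input, so that case is promoted to an exception here.
        $oXml = simplexml_load_string($strData, 'SimpleXMLElement', LIBXML_NOCDATA);
        if ($oXml === false) {
            throw new Exception('Feed is not well-formed XML');
        }
        foreach ($oXml->channel->item as $oItem) {
            $aItems[] = array(
                'title' => (string)$oItem->title,
                'link'  => (string)$oItem->link,
            );
        }
    } catch (Exception $e) {
        // A bad feed yields an empty list instead of killing the page.
    }
    return $aItems;
}

$strData = '<?xml version="1.0"?><rss version="2.0"><channel>'
         . '<item><title><![CDATA[Welcome to my blog]]></title>'
         . '<link>http://127.0.0.1/my-blog</link></item>'
         . '</channel></rss>';

foreach (read_feed_items($strData) as $aItem) {
    echo $aItem['title'] . ' - ' . $aItem['link'] . "\n";
}
```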

That is all you need to read XML with PHP.

One note: most servers send XML as UTF-8 instead of plain ASCII text. If your feed reader takes some UTF-8 text and displays it as ASCII without properly converting it first, you may notice artifacts in the text that don't make sense. These kinds of issues can be fixed with PHP's multi-byte string functions, at the expense of added complexity in the feed reader code.

This is not ‘production’ code.

The PHP example is not really production code. The PHP process blocks while waiting for the feed to be read; it won't continue loading the page until the feed is loaded or the connection between your server and the feed server times out. A workaround is to load the feed and cache it on the server, either as a file or in memory with a tool like memcached. In most cases this scales well and mitigates the dependency on the other site, but it does not eliminate the problem.
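A file-based cache along those lines can be sketched in a few lines. The function name, fallback behavior, and five-minute TTL are placeholder choices of mine, not part of the example code:

```php
<?php
// Sketch of a file-based feed cache: fetch the remote feed at most once per
// $cache_ttl seconds, otherwise serve the copy saved on disk. If the remote
// fetch fails, fall back to a stale copy rather than show nothing.
function get_feed_cached($url, $cache_file, $cache_ttl = 300)
{
    if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $cache_ttl) {
        return file_get_contents($cache_file); // cache is fresh; skip the remote call
    }
    $data = @file_get_contents($url); // the blocking call now happens at most once per TTL
    if ($data !== false) {
        file_put_contents($cache_file, $data);
        return $data;
    }
    return file_exists($cache_file) ? file_get_contents($cache_file) : false;
}
```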

Reading feeds with JavaScript:

A JavaScript example from scratch would be a lot more complicated than the PHP example, so I'm using Jean-Francois Hovinne's jQuery plug-in, jFeed.

This implementation has the added bonus of dealing well with other character sets like UTF-8.
The files included in the example code:

  • jquery.js – The core of the jQuery JavaScript library
  • jquery.jfeed.pack.js – A minified version of the jFeed plug-in
  • reader.js – A simple reader which works just like the PHP reader, except in JavaScript
  • proxy.php – Built on Jean's example, with an extra security check

In this case, the page will load completely before the jFeed library tries to load the feed. jFeed uses an XMLHttpRequest to load the feed data and jQuery's DOM functions to parse it, giving us access to the data elements much like SimpleXML does in PHP.

Once the feed is loaded, it uses jQuery's DOM functions to add the content to the page. The user will notice some lag between the page loading and the feed loading, but the feed won't slow down the rest of the page as it did with the PHP example. Caching via a quick PHP script could also be used here to speed things up a bit.

JavaScript does have limitations. You won't be able to read a feed from another domain, which is why Jean-Francois included the proxy.php script. This script loads the feed and presents it to the JavaScript as if it came from the same domain. He says "don't use this in production" and I would agree. I made one little change to his script, adding a check to make sure the request is coming from localhost and not from some external site using us as a generic proxy, which should make it a little safer to use. Having something like this on your server also opens your domain to XSS attacks.
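The localhost check can be sketched like this; the exact test in the modified proxy.php may differ, but the idea is to refuse any request that doesn't originate from the machine serving the page:

```php
<?php
// Sketch of restricting the proxy to local requests: compare the client
// address against the loopback addresses. Illustrative, not hardened.
function request_is_local($server)
{
    $loopback = array('127.0.0.1', '::1');
    return isset($server['REMOTE_ADDR'])
        && in_array($server['REMOTE_ADDR'], $loopback, true);
}

// In the proxy script itself (skipped here when run from the command line):
if (php_sapi_name() !== 'cli' && !request_is_local($_SERVER)) {
    header('HTTP/1.1 403 Forbidden');
    exit('This proxy only serves local requests.');
}
```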

Javascript Example Code

To read the feed, include the required JavaScript files in your document's head (they are included in the sample code):

<html>
<head>

<script type="text/javascript" src="jquery/jquery.js"></script>
<script type="text/javascript" src="jquery/jquery.jfeed.pack.js"></script>
<script type="text/javascript" src="jquery/reader.js"></script>

Add a div tag placeholder where you would like your RSS feed to appear:

<div id="feedReplace"></div>

Add this just above the closing body tag.

<script type="text/javascript">getFeed('#feedReplace', 'http://yoururl.com/rssfeed', 5);</script>

When the page loads, the JavaScript will load the RSS, parse it, format it, and append it to the page at the location specified. You can also specify how many items it should show; the example above shows the top 5 items.


Which approach is better?

If the feed data is a featured item of your site, it may make more sense to use PHP, because you can be sure the content will load. If the feed is just an extra on a larger site, JavaScript makes a lot of sense because it doesn't bog down the site while the feed is loading.

Cogs of the web in plain English

A while back a client asked for help bringing their great idea to life, but had no clue where to start or how websites really get built. This was my explanation of web development lingo in plain English. Hopefully it will be useful to someone else.


Get a domain name via a registrar…

For people, names are usually easier to remember than numbers. That's why a bunch of smart people came up with a system to convert a text 'name' into the number that identifies a computer on the Internet. ICANN is the organization that manages requests for domain names. The registrar acts as a broker, doing the paperwork to request a domain name on your behalf.

It's a lease, not a purchase.
When a name is registered, it's 'owned' by you for the period of time specified in the paperwork. You can extend that time-frame at any time before the domain name 'expires'. The shortest period you can register a domain for is one year. ICANN allows registrars to set their own fees, so you'll see different prices from different registrars.

Popular registrars in America include GoDaddy.com, Network Solutions, Register.com, Tucows, and Web.com.

Registrars also provide another service called the domain name service, or DNS. This is the actual server responsible for pointing your domain name at the correct computer on the net. This allows you to register your site with, say, GoDaddy and actually host it somewhere else (like your own server room).

In my experience, GoDaddy is the least expensive. I also like their online tools for managing DNS and domain details; they are pretty intuitive. You should expect to spend between $10 and $50 a year per domain name.

One gotcha they don't explain very well on any of the sites: when registering your domain name, you'll be asked to provide contact information. It's important that you fill in current and correct information here. If there are any issues with your domain, the registrar will attempt to contact you using the information on file. If an action (transfer, sale, cancellation) is not disputed in 90 days, the action is accepted by default. I worked for someone who allowed an employee to register their domain name, and when he was fired about a year later, he took the company's domain name with him. (Remember, e-mail relies on your domain name too.) It took them two years to get the mess around that name straightened out. ICANN's arbitration process is very slow.

Now you need a server…

Web hosting is leasing space on a web server, along with connectivity to the Internet, so you can share files/pages with people over the Internet. The web host maintains the machine, the web server software, and the networking infrastructure as a service. Most web hosts have a guaranteed uptime, or percentage of the time that your site will be accessible via the Internet. (99.9% uptime roughly equates to 45 minutes of downtime a month.)

There are also several types of hosting:

Shared hosting: Servers are expensive. The more people a web hosting company can serve from a single physical server, the more money they make. This makes the service very affordable (as low as $5 a month), but at the expense of performance. All the sites hosted on the server compete for resources (CPU cycles, disk space, RAM), and one busy site can slow the whole server down.

Virtual Private Servers (VPS): With special software, a physical server can be split into several logical servers. (Think Virtual PC, where you can run Microsoft Windows on your Mac.) You can rent one of these virtual machines and get a higher level of access to the machine and a guaranteed set of resources, without the expense of leasing a whole server.

Dedicated hosting:
Dedicated hosting is leasing a whole physical server. This means you're not sharing computer resources with any other sites. It's also one of the more expensive options when it comes to hosting ($99 to $1000/month).

Co-location:
This is like a dedicated server, except you bring your own computer. You're renting space in the data center and network connectivity. This level of service is very expensive ($500/month and up) and does not include support for the hardware or software.

Cloud/Grid hosting:
Similar to VPS, except your use of resources like CPU and memory is metered: you pay for what you use. If your site gets really busy, your bill will be higher, but the performance of your site will not degrade, because your server is really spread across thousands of physical computers. Services like this include AppJet, Google AppEngine, Amazon EC2, Microsoft Azure Services, etc.
Because the work is spread across many physical machines, the likelihood of a machine failure taking your site down is very low, and the rates are often very competitive. These cutting-edge services are often not as easy to use as a simple hosting solution, but expect that to change soon.

A web platform
When web hosts talk about the 'web platforms' they support, they're talking about the different application-layer and database-layer technologies they offer. The application layer is where the logic of the web application lives. The database layer is where most of the data used by an application is stored. There are many popular combinations of the two. Popular application layers are PHP, .NET, Java, Ruby, Perl, and Python. Popular databases are MySQL, MS-SQL, Oracle, PostgreSQL, and DB2.
Secure servers
A secure connection to a web server requires that an SSL (Secure Sockets Layer) certificate be installed on the web server. This is a unique identifier that shows the issuer has verified your business and trusts you. The more trusted the issuer, the more expensive the certificate. Technically, a certificate issued by the cheapest provider is just as secure as the most expensive one; the only difference is the process they use to verify you.

Additional Recommended Services:

Version control
Version control is a system which manages source code produced by many developers and makes sure they don't overwrite each other's changes while they work on the same set of code. SVN is a free tool and can be installed on most servers. Unfortunately, most hosting companies don't allow you to install extra software. There are several companies which provide hosting for source control, and most of them also provide other tools with these accounts to further enhance developer collaboration.

Issue tracking
Issue tracking is an automated system for tracking features and bugs in software. It's vital to a collaborative development environment; so much so that the version control hosting companies mentioned above include it with their services at no extra cost. There are also several open source tools like Mantis, Trac, and Bugzilla, as well as JIRA on the commercial side. At the service center, we use Mantis. I also use Mantis with other clients hosted from my GoDaddy account.

Serving Office 2007 documents to IE Users with Apache

I encountered an issue downloading Office 2007 documents with IE: instead of opening them, IE would save them as ZIP files.

In reality, these files are ZIP archives, but our users need them to open in the correct application.

The solution was adding the new MIME types to Apache.
I was unaware that Apache provides MIME type info for each file served, or that IE actually used that information.

Adding the following to my Apache config fixed the problem:

    #Types for the new office documents
    AddType application/vnd.openxmlformats-officedocument.wordprocessingml.document .docx
    AddType application/vnd.openxmlformats-officedocument.presentationml.presentation .pptx
    AddType application/vnd.openxmlformats-officedocument.spreadsheetml.sheet .xlsx
    AddType application/vnd.ms-xpsdocument .xps

References:
http://blogs.msdn.com/dmahugh/archive/2006/08/08/692600.aspx
http://httpd.apache.org/docs/2.2/mod/mod_mime.html#addtype

Send a file via POST with cURL and PHP

cURL is a great library. It can do just about anything a normal web browser can do, including sending a file via a POST request.

This makes it really easy to transmit files between computers. In my case, I was looking for an easy way to send images snapped by various webcam systems to a central server, with PHP managing the images.

Here is a simple script to send a file with php/cURL via POST:

<?php
	// NOTE: the opening lines of this script did not survive publishing;
	// the URL, file path variable, and 'extra_info' field name below are
	// reconstructed. The '@' prefix is what tells cURL to attach the file
	// itself rather than send the path as a plain string.
	$target_url = 'http://127.0.0.1/accept.php';
	$file_name_with_full_path = realpath('./sample.jpeg');
	$post = array('extra_info' => '123456', 'file_contents' => '@'.$file_name_with_full_path);

	$ch = curl_init();
	curl_setopt($ch, CURLOPT_URL, $target_url);
	curl_setopt($ch, CURLOPT_POST, 1);
	curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
	$result = curl_exec($ch);
	curl_close($ch);
	echo $result;
?>

And here is the corresponding script to accept the file.

<?php
	// NOTE: the opening lines of this script did not survive publishing;
	// the upload directory handling below is reconstructed.
	$uploaddir = realpath('./') . '/';
	$uploadfile = $uploaddir . basename($_FILES['file_contents']['name']);
	echo '<pre>';
	if (move_uploaded_file($_FILES['file_contents']['tmp_name'], $uploadfile)) {
	    echo "File is valid, and was successfully uploaded.\n";
	} else {
	    echo "Possible file upload attack!\n";
	}
	echo 'Here is some more debugging info:';
	print_r($_FILES);
	echo "\n";
	print_r($_POST);
	print "</pre>\n";
?>

And that's it.
Navigate to the 'send' script and it will transmit the file sample.jpeg to the accept script.

Note that you can include other arguments in the POST along with the file. This allows you to authenticate the upload; I'm using pre-shared strings to 'validate' that the upload came from my send script.
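On the accept side, that validation can be sketched as below. The field name ('extra_info' here) is an assumption; use whatever key your send script actually posts, and in practice use a long random secret rather than '123456':

```php
<?php
// Sketch of checking the pre-shared string before accepting the upload.
define('UPLOAD_SECRET', '123456'); // must match the value the send script posts

function upload_is_authorized($post)
{
    return isset($post['extra_info']) && $post['extra_info'] === UPLOAD_SECRET;
}

// In the accept script, before calling move_uploaded_file():
// if (!upload_is_authorized($_POST)) { exit('Unauthorized upload.'); }
```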

This works with the command line version of php too.

 

UPDATE Oct. 2014 – 

If you're using PHP 5.5 or later, check out the CURLFile class added in that release, which makes the whole process a lot easier.
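Here is a sketch of the same upload using CURLFile, which replaces the fragile '@' filename prefix. The helper function, URL, and field names are my own placeholders mirroring the example above:

```php
<?php
// Build (but don't execute) a cURL handle that POSTs a file with CURLFile.
function build_upload_request($target_url, $file_path, $shared_key)
{
    // CURLFile carries the path, an optional MIME type, and the name the
    // receiving script will see in $_FILES.
    $cfile = new CURLFile($file_path, 'image/jpeg', basename($file_path));
    $post = array('extra_info' => $shared_key, 'file_contents' => $cfile);

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $target_url);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    return $ch;
}

// Usage:
// $ch = build_upload_request('http://127.0.0.1/accept.php', './sample.jpeg', '123456');
// $result = curl_exec($ch);
// curl_close($ch);
```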

 

References:
http://us3.php.net/manual/en/function.move-uploaded-file.php
http://us3.php.net/manual/en/features.file-upload.post-method.php
http://curl.haxx.se/libcurl/php/examples/multipartpost.html
http://forums.devshed.com/php-development-5/php-curl-send-a-file-533233.html