[UPDATED: Saturday, March 08, 2008 at 10:26:56 AM]
I've been playing around with e - TextEditor this weekend to see if it might replace Textpad for me. I've always liked Textpad's speed, and it does a really good job on large files. I use Eclipse as my main IDE, so when it comes to a Notepad replacement my two main requirements are fast load times and the ability to handle large text files (for reading logs).
However, I keep seeing some of the really cool things that TextMate can do, so I've been keeping an eye on e - TextEditor. It has a couple of features that would make quick editing of HTML files extremely easy. I really like the ability to select a bunch of words or tags in a document and easily replace them.
One of the key features of e is its TextMate bundle support, which really allows you to extend the functionality of the program. While surfing the e forums, I came across a bundle command which opens the file you have selected in the document (or the file where the caret is positioned). This script makes it really easy to open files being loaded from <script />, <style /> or any other tag you may use for loading files.
The original script, posted by tanguyr, required you either to add e.exe to the Windows PATH environment variable or to hard-code the path to e.exe. I fixed the problem by using one of e's environment variables: $TM_SUPPORT_PATH points to a sub-directory beneath the folder where e.exe lives, so I just use a relative path to get back up to the e.exe executable.
To add this command to e, go to Bundles > Edit Bundles > Show Bundle Editor. I added this script to the Source bundle, as it seemed to make the most sense to me. Select the Source folder, click the + button and select the "New Command" option. I decided to call the command "Open Target Document"; if you don't like the name, choose something you like better.
Paste the following Bash script into the source area.
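A minimal sketch of the idea, assuming e exposes the TextMate-style variables $TM_SELECTED_TEXT, $TM_CURRENT_WORD and $TM_SUPPORT_PATH (the exact install-directory layout is an assumption on my part):

```shell
# Hypothetical sketch of the bundle command: build the command line that
# opens the selected filename (falling back to the word under the caret)
# with e.exe, located one level above $TM_SUPPORT_PATH.
open_target() {
    # Prefer an explicit selection; fall back to the word under the caret.
    local target="${TM_SELECTED_TEXT:-$TM_CURRENT_WORD}"
    # $TM_SUPPORT_PATH sits one directory below e.exe, so step up one level.
    local e_exe="$TM_SUPPORT_PATH/../e.exe"
    echo "\"$e_exe\" \"$target\""
}
```

In the real bundle command you would execute the result rather than echo it; echoing here just makes the relative-path trick visible.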
As I mentioned earlier, I've been working on an SVN post-commit script. We've got an SVN repository that will be modified by several remote developers, so I need to closely monitor changes to it.
There are two major functions that I needed in my post-commit script:
There is an abundance of examples showing how to do this in various *nix flavors, but I couldn't find any good Windows-based solutions that didn't require Perl to be installed on the server. That led me to create the following post-commit.bat script.
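For context, the *nix-flavor examples all follow the same shape, because Subversion invokes the hook with the repository path and the new revision number as arguments. A minimal sketch (svnlook ships with Subversion; the /usr/bin path and the notification step are illustrative):

```shell
# Sketch of a generic post-commit hook. Subversion calls it as:
#   post-commit REPOS REV
build_notify_command() {
    local repos="$1"   # absolute path to the repository
    local rev="$2"     # revision number that was just committed
    # Use absolute paths to executables: Subversion runs hooks with an
    # empty environment, so nothing from PATH/%PATH% is available.
    echo "/usr/bin/svnlook changed -r $rev $repos"
}
```

A real hook would pipe the output of that svnlook command to a mailer or a log file rather than just building the command string.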
I've been working on a post-commit hook for our Subversion install and was running into a number of issues. The post-commit.bat file would run fine from the command line, but I just couldn't get things to work as I expected from SVN. After much debugging and scouring Google for answers, I've found a few tips that will hopefully help you troubleshoot your own SVN repository hooks.
This was the biggest issue I was running into, because I was expecting my script to be able to find programs via my %PATH% environment variable. SVN executes hook scripts with an empty environment, so %PATH% simply isn't there. That's the main reason my scripts were working fine from the command line but breaking when executed from my SVN hook.
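A defensive pattern that works around this (shown here in shell; directory choices are illustrative, and the same idea applies to %PATH% in post-commit.bat):

```shell
# Hooks inherit an empty environment, so pin PATH yourself at the top of
# the script and fail loudly if a required tool still can't be found.
setup_hook_env() {
    export PATH="/usr/bin:/bin"
    local tool="$1"
    if command -v "$tool" >/dev/null 2>&1; then
        echo "ok"
    else
        echo "missing: $tool"
        return 1
    fi
}
```

Failing loudly matters because hook failures are otherwise invisible; writing the error somewhere you can read it is half the battle.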
There were two important blog posts I came across yesterday that detail closely related problems in ColdFusion 8. Both issues revolve around changes to the JDBC drivers in ColdFusion 8, and they're both essentially invisible issues (unless you're running SeeFusion).
The first issue relates to a new feature in the datasource configuration called the "Validation Query." ColdFusion 8 introduced the ability to define a query (typically something cheap, such as SELECT 1) that runs each time a connection to a DSN is reused after a period of inactivity. This is useful in cases where the CF server might lose its connection to the database server due to network issues.
As Daryl Banttari of Webapper reported yesterday, the problem is a bug in the Admin API that causes the "Validation Query" for each DSN it touches to be set to the value "0". This causes ColdFusion to silently throw a java.sql.SQLException each time the validation query is executed. Since this is a silent exception, you have no idea it's happening.
This is actually old news—but I thought it was important to be reminded of it. Last year Yahoo's UI blog posted some findings on user browser cache usage. Here's what they found:
40-60% of Yahoo!’s users have an empty cache experience and ~20% of all page views are done with an empty cache. To my knowledge, there’s no other research that shows this kind of information. And I don’t know about you, but these results came to us as a big surprise. It says that even if your assets are optimized for maximum caching, there are a significant number of users that will always have an empty cache. This goes back to the earlier point that reducing the number of HTTP requests has the biggest impact on reducing response time. The percentage of users with an empty cache for different web pages may vary, especially for pages with a high number of active (daily) users. However, we found in our study that regardless of usage patterns, the percentage of page views with an empty cache is always ~20%.
I definitely recommend reading the whole post as there's lots of good information on optimizing your site for performance. This article is just one posting in a series of posts on how to improve your site's HTTP performance.
Microsoft has released an IIS deployment tool which allows you to sync settings between IIS servers or to migrate settings from an IIS6 server to an IIS7 server. From the Microsoft Web Deployment Team Blog:
So what is this new deployment tool? You may have read Scott Guthrie’s post about the future of ASP.NET and IIS. In the post he mentioned the roadmap for a web deployment framework, that’s us. :) In our first version, we’re releasing a command-line tool called msdeploy.exe that provides support for deploying, synchronizing and migrating IIS 6.0 and 7.0.
It supports moving configuration, content, SSL certificates and other types of data associated with a web server. You can choose to sync a single site or the entire web server. Because we know that one tool can never ‘automagically’ guess what your application relies on, we’ve tried to be pretty flexible and powerful – you can customize exactly what you want to sync using a manifest file. You can also skip sites or other objects, or you can perform regular expression replacements during a sync (like changing the home directory on the destination machine).
The goal of the tool is to help you keep servers in sync, to make deployment easier and also to help with migrating to new versions of IIS. You could use a sync on two machines in a web farm, for example. Or maybe you need to move to a new server of the same version, you can use this tool. Of course, we also enable you to do a migration from IIS 6.0 to 7.0.
Microsoft has released some walkthroughs to teach you how to use the tool and has released both an x86 version and an x64 version.
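Invocations follow a -verb:/-source:/-dest: pattern; here is a hedged sketch of what a sync to and from an archive looks like (the provider names, site IDs and paths here are from memory and purely illustrative, so check the walkthroughs before relying on them):

```
rem Back up an IIS 6.0 site (metabase ID 1) to an archive:
msdeploy -verb:sync -source:metakey=lm/w3svc/1 -dest:package=c:\Site1.zip

rem Sync that archive onto another server:
msdeploy -verb:sync -source:package=c:\Site1.zip -dest:metakey=lm/w3svc/1
```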
John Resig decided to do a little experiment to see how browsers handle sub-pixel problems in CSS. John wanted to see what happens when you have 4 floated divs, each with a width of 25% that were contained inside a parent element with a fixed width of 50 pixels. This poses a problem since the correct mathematical width of each floated div should be 12.5px and browsers typically can't handle sub-pixel rendering.
Here's what John found:
This is actually something I've wondered about for a long time, but never taken the time to research how the browsers handle it.
I'd definitely go take a look at John's post because it contains lots of good information and comments.
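The arithmetic at the heart of the problem is easy to reproduce: 50px / 4 columns = 12.5px, so some columns have to get 12px and others 13px. One well-known way to split the pixels without overflowing the parent (this is just an illustration of the math, not a claim about what any particular browser does) is the largest-remainder approach:

```shell
# Illustrative only: divide `width` pixels among `n` equal columns using
# whole pixels, handing the leftover pixels to the first few columns so
# the total always equals the parent width exactly.
split_pixels() {
    local width=$1 n=$2
    local base=$(( width / n ))
    local leftover=$(( width - base * n ))
    local out="" i
    for (( i = 0; i < n; i++ )); do
        if (( i < leftover )); then
            out+="$(( base + 1 )) "
        else
            out+="$base "
        fi
    done
    echo "${out% }"
}
```

For 50px over 4 columns this yields two 13px columns and two 12px columns, summing exactly to 50; naive rounding of 12.5 in every column would either underflow or overflow the parent.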
There's lots of talk today about the new proposed X-UA-Compatible header, which is being driven by the WaSP-Microsoft Task Force. The goal of the proposal is to provide forward compatibility for IE and other browsers. The idea is that you add a tag to your HTML that locks browsers into a specific compatibility mode:
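As proposed, it's an http-equiv meta element; the version token below follows the published examples (the proposal also allows pinning multiple browsers at once, e.g. IE=8;FF=3):

```html
<meta http-equiv="X-UA-Compatible" content="IE=8" />
```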
The above tag theoretically would prevent Internet Explorer 9 or Firefox 4 from breaking pages because of changes to their standards support, by making them render just like prior versions.
While the goal is a noble one, I actually see this causing more harm than good. From past experience, something tells me that future products won't get backwards compatibility 100% correct. Since the whole point is to prevent sites from breaking with future releases, what problem have you solved if the browser is going to have bugs rendering in its compatibility mode anyway?
I like the idea that browser vendors (especially Microsoft) are trying to make web development more future proof, but I think there's got to be a better solution.
Aptana just released a new project called Jaxer. In a nutshell, Jaxer is a server filter which parses files and can execute JavaScript on the server. It's like a headless version of Firefox. All your client-side libraries (like jQuery) will run in Jaxer, which is pretty neat.
What this means is that the JavaScript you write can be used on both the client and the server. Jaxer actually resolves one of the most complicated problems web developers face: ensuring data is validated using the exact same rules on both the client and the server. Because Jaxer can execute your JS code on the server, you can write one set of validation functions and use them in both places.
Aptana's posted a screencast showing a simple client/server validation example using Jaxer that I recommend viewing. It's a simple example, but shows off the potential power of Jaxer.
I also recommend checking out Ajaxian's post on Jaxer as well as John Resig's post which both provide example code and give further insight.
At this point this product has a high "Cool!" and "Wow!" factor, but I really wonder who their target audience is. Hopefully I'll have some free time to play around with this sometime soon.
Yahoo announced that they're supporting the OpenID standard for universal log-in. This could go a long way toward more web sites adopting OpenID. The idea of a single log-in that you can use on any website is a noble goal.
I'm not sure I'd ever use a global log-in on important websites (such as those dealing with financial information), but for general information-based sites this would be a welcome improvement over managing tons of log-ins and passwords.
Ben Nadel posted about using a ColdFusion custom tag to act as a proxy to invoke a CFC. The benefit of this technique is that you're able to invoke CFCs from outside the webroot without creating any server-level mappings.
There have been several solutions for invoking CFCs without using a mapping, including the component() UDF solution I've blogged about in the past. The reason I think most people hesitate to use the component() UDF, which allows for using relative paths, is that it relies on underlying Java calls which are not officially supported. While the UDF works in ColdFusion versions 6 through 8, I understand why people might be hesitant to use it.
Today I ran across an interesting post on cfSearching on how to display the ColdFusion JDBC driver versions on Windows. Knowing this could be useful debugging information, I decided to save the code snippet for future use. However, I quickly realized it was targeted directly at ColdFusion MX 7 and also had some hard-coded paths that would need to be changed in order to run it on other systems.
Since I wanted to be able to run this on any version of ColdFusion, I modified the code a bit so it works with any version of ColdFusion 6.1 or higher without requiring changes to any of the path information.
Here's my modified version of the code:
I also modified the original code to output the version of ColdFusion that's running on the server. Credit goes to cfSearching for the original code.
There's always tons of stuff shown off at CES each year. This year VIA showed off a prototype 3.5" x86 PC running Linux. They're promoting it as an advanced PMP, but I could definitely see using it as some kind of portable server.
The specs I could find on Gizmodo list the following:
Processor: VIA C7-M @ 1.0 GHz
Storage: 8 GB Flash
Display: 2.8" LCD
Resolution: 640 x 480
Dimensions: 8.5 cm x 8.5 cm x 2 cm
Weight: 150 g
Battery Life: 4 hours
Connectivity: WiMAX & WiFi
Video Codecs: H.264, WMV, RMVB, MOV, FLV, DIVX
Every now and again I need to test some code against live data. The reasons vary: recreating a bug, testing a piece of code for performance, or trying out a UI widget.
In the past, the quick-and-dirty way I've done this was to move a template into production that would perform the query and then convert the query object to WDDX. This would let me create a "copy" of the live data that I could port to development. The problem with this method is that it forces you to move temporary code into a production environment and then remember to remove it.
Whenever the topic of my employment comes up, everyone's first reaction when I tell them I work from home is: "Wow, that must be really nice!" While working from home definitely has some benefits, it also has its downsides.
As Cameron Childress mentions in his post on Coworking in Atlanta, the two hardest things to adjust to are the lack of socialization and the need for self-motivation; both are issues Cameron and I have discussed in the past.