Extract OCXState Data

At work we have recently been investing effort in paying down our technical debt. So far we have introduced Unicode into the native code, migrated our VB6 components to VB.NET, upgraded to .NET 4.0, and now we are working on 64-bit support.

However, we have used the Microsoft FlexGrid quite extensively in our VB6 (now VB.NET) components, and it does not support 64-bit. We therefore decided to build a compatible control which wraps the existing WinForms DataGridView control.

We discovered quite early on that properties set through the designer on an ActiveX control do not generate designer code; instead they are serialized as a resource and restored through the OcxState property.

Always keen to reduce the workload, I devised a way of extracting this information so that we can redo the designer-set properties. I found that I could give the control an OcxState from another ResX and simply look at the changed properties.

In the project I created a form with the desired control on it and hacked around with the designer-generated code to allow me to override the OcxState.

I used the ResXResourceReader and ResourceSet classes to get the OcxState from my desired ResX source (the project whose state I wanted to inspect). Then, using reflection, I compared the default state of the control (using the OcxState that the designer created) against the desired state (using my own OcxState extracted from the other ResX), which told me exactly which properties had been changed in the designer.
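The comparison itself is just a property-by-property diff. Here is a minimal sketch of that idea in Python rather than VB.NET (the actual work used .NET reflection over a live control; the GridState class and its property names below are invented for illustration):

```python
# Sketch of the reflection diff: walk the public data attributes of two
# objects and report the ones whose values differ. The real version did
# this with .NET reflection over a control loaded with each OcxState.

def diff_properties(default_obj, desired_obj):
    """Return {name: (default_value, desired_value)} for differing properties."""
    changed = {}
    for name in dir(desired_obj):
        if name.startswith("_"):
            continue                      # skip private/special members
        default_val = getattr(default_obj, name, None)
        desired_val = getattr(desired_obj, name, None)
        if callable(desired_val):
            continue                      # compare data, not methods
        if default_val != desired_val:
            changed[name] = (default_val, desired_val)
    return changed

# Hypothetical stand-in for a grid control's state
class GridState:
    def __init__(self, rows, cols, fixed_rows):
        self.rows, self.cols, self.fixed_rows = rows, cols, fixed_rows

default = GridState(rows=2, cols=2, fixed_rows=1)
desired = GridState(rows=50, cols=8, fixed_rows=1)
print(diff_properties(default, desired))
# {'cols': (2, 8), 'rows': (2, 50)}
```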

MSBuild creating lots of temp files


I found a msdn forum post with exactly the same issue: http://social.msdn.microsoft.com/Forums/en-US/netfxbcl/thread/42647eff-ecb8-412c-a884-a152b6fdd40d

It turns out that it is NOT MSBuild but SN.exe leaving the temp files behind when re-signing assemblies. Not that you could easily come to this conclusion by watching the build, with those files flashing by so quickly.

On reflection, a custom task will not help either…

This post has more details on the issue, with a link to a hotfix (I haven't tried it yet): http://blogs.msdn.com/pfedev/archive/2008/10/24/sn-exe-and-empty-temp-files-in-temp.aspx


Simpler version of the 7-Zip backup

I have tested the 7-Zip backup I constructed and posted about a while back, and I have come to realise that it is a tad too complicated. I even went through the process of creating a proxy application to try to make it a bit cleverer, but in the end I simplified it down to just four parts: the logic, the settings and the file lists.

How to use it

  • Place the batch file and settings files in a folder somewhere.
  • Update the settings file with the correct paths.
  • Pass the path to the settings file to the backup batch file, or create a scheduled task to do the same.
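As a rough illustration of what the settings file amounts to, sketched in Python rather than batch: it is just KEY=VALUE lines that the script reads before calling 7-zip. The key names below are invented for illustration; the real settings file defines its own.

```python
# Parse a simple KEY=VALUE settings file, skipping blanks and comments.
# This mirrors what the backup batch file does with the path it is given.

def parse_settings(lines):
    """Parse KEY=VALUE lines into a dict, ignoring blanks and # comments."""
    settings = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings

example = [
    "# where the archive goes",
    "ARCHIVE_DIR=D:\\Backups",
    "SOURCE_LIST=Backupset.txt",
]
print(parse_settings(example))
# {'ARCHIVE_DIR': 'D:\\Backups', 'SOURCE_LIST': 'Backupset.txt'}
```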

Using 7-zip and batch files to perform backups automatically

UPDATE: I have recently overhauled this and created a simpler script in this post: /2009/10/17/simpler-version-of-the-7-zip-backup/

First, a little history:

Since the days when I started using XP, I had gotten used to the control that NT Backup gave me over backing up my files.

About a year ago I moved to Vista and was unhappy with the “All or nothing” approach that the bundled backup software provided. I was also using OneCare at the time and it followed a similar policy.

Since moving to BitDefender I was much happier with the backup software it provided, yet the fact that I was not using a well-known backup format rubbed me the wrong way.

So, using this article as a guide, I engineered my own backup mechanism using 7-zip: http://www.wilshireone.com/article/115/easy-backups-with-7-zip

Now here's the fun part. My backup schedule consists of a weekly incremental backup and one monthly full backup. My files are not so important that I need frequent backups, so this schedule suits me down to the ground.

My solution consists of 7 parts:

  • Backup.cmd, this is the main script that executes the command
  • Backupset.txt, a list of files and folders that I wish to backup
  • CurrentSet.txt, a path to the current backup file. Used by incremental backups to update
  • FullBackup.cmd, the script to run a full backup
  • FullBackupSettings.ini, the settings used by a full backup
  • IncrementalBackup.cmd, the script to run an incremental backup
  • IncrementalBackupSettings.ini, the settings used by the incremental backup

Now, you could remove the separate scripts for the full and incremental backups; the reason I created them was so that I don't have to change the arguments in the task scheduler. Instead it is all controlled through those scripts.


Backup.cmd loads the settings file passed as argument 1, does some checking, then calls 7-zip to begin backing up. The settings files define how the files are added, where the archive goes, and so on.
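For illustration, here is roughly what that boils down to, sketched in Python instead of batch. The mode names and file names are assumptions; the @listfile syntax is 7-Zip's own way of passing a list of files.

```python
# Build the 7-Zip command line the backup script ends up running:
# "a" adds files to a fresh archive (full backup), "u" updates an
# existing one. The file list is passed with 7-Zip's @listfile syntax.
import subprocess

def build_command(mode, archive, list_file):
    """mode: 'full' -> 7z a, anything else -> 7z u."""
    seven_zip_cmd = "a" if mode == "full" else "u"
    return ["7z", seven_zip_cmd, archive, "@" + list_file]

print(build_command("full", "D:\\Backups\\backup.7z", "Backupset.txt"))
# ['7z', 'a', 'D:\\Backups\\backup.7z', '@Backupset.txt']

# To actually run it (requires 7z on the PATH):
# subprocess.run(build_command("full", "backup.7z", "Backupset.txt"), check=True)
```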


Backupset.txt is the list file passed to 7-zip. The only downside is that you cannot list the same folder or file name twice in this file. The workaround would be to invoke multiple backup scripts and then add the duplicate folders and nested zips.


CurrentSet.txt contains the path of the backup file created by the last full backup. Incremental backups read it in and use it to know which archive to update.


FullBackup.cmd is simple: it just calls the main backup script with the correct settings.


FullBackupSettings.ini is read in and used by the main backup script. Note the mode and backup type; these are what separate the two types of backup.


IncrementalBackup.cmd is similar to the full version, except it provides the incremental backup settings.


IncrementalBackupSettings.ini is similar to the full settings, but with the zip mode set to update and the backup type defined as incremental.

Once that has all been put in place, all I do is create two tasks in the Windows Task Scheduler.

You could go one step further and have the script copy the backups to another disk; I just do this myself when I feel like it.


After a little testing I found a flaw in the technique I was using: new files were being added to the old archive, and any deleted files were being left in it. I want to keep files that I have deleted, but I do not want them muddying up the actual latest image.

Instead, I used the -u switch to stop updates being made to the base archive and to add the new and changed files to a new archive, which makes it incremental in the true sense.
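Sketched as a command line (built here as a Python list for clarity), this follows the incremental-backup example in the 7-Zip manual's documentation of the -u switch. The file names are placeholders, and the -u action string should be checked against your 7-Zip version's docs.

```python
# -u- stops 7-Zip from updating the base archive in place, and
# -u<actions>!new_archive redirects new and changed files into a fresh
# archive instead. The action codes p0q3r2x2y2z0w2 follow the
# incremental-backup example in the 7-Zip manual.
def incremental_command(base_archive, new_archive, list_file):
    return [
        "7z", "u", base_archive, "@" + list_file,
        "-u-",                              # leave the base archive untouched
        "-up0q3r2x2y2z0w2!" + new_archive,  # write the changes here instead
    ]

print(incremental_command("full.7z", "inc-2009-10.7z", "Backupset.txt"))
# ['7z', 'u', 'full.7z', '@Backupset.txt', '-u-', '-up0q3r2x2y2z0w2!inc-2009-10.7z']
```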


In fact, I have gone one step further and built a command-line tool that wraps 7-zip and provides a nicer mechanism for configuring the backups before executing 7-zip. Once I have tested it a little I will post the source code.

Unit testing with NetBeans and phpUnit

Huzzah! I have found a way to run unit tests in NetBeans.

Since phpUnit is essentially a set of PHP scripts itself, I figured there must be a way of invoking it directly.

What I did was set up a source file as the bootstrap for the test runner and set this file as the NetBeans index file. From there you can just run it and see the test results in the output window, or debug it and step into the test you are interested in.

It is not a fantastic solution, but it is definitely a stop-gap until NetBeans implements phpUnit testing properly.

Here's my bootstrap file for running tests:

It's very simple: I hard-code my arguments into the server environment variable (you could instead use the arguments in the NetBeans run configuration), which is where phpUnit reads its arguments from. I then include the phpUnit TextUI test runner, which has the main entry point, exclude this file from code coverage, and just sit back and let phpUnit run its course.
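The bootstrap itself is PHP (faking the $_SERVER arguments and including phpUnit's TextUI runner), but the trick translates to any xUnit framework. For illustration only, here is the same pattern with Python's unittest: hard-code the runner's arguments, then hand control to the framework's entry point.

```python
# Same idea as the phpUnit bootstrap, sketched with Python's unittest:
# the bootstrap fakes the command-line arguments the runner expects,
# then invokes the framework's main entry point directly, so an IDE
# only needs to "run this file".
import sys
import unittest

class ExampleTest(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

if __name__ == "__main__":
    sys.argv = ["bootstrap", "-v"]   # hard-coded runner arguments
    unittest.main(exit=False)        # let the framework run its course
```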

This is only a workaround until NetBeans supports phpUnit natively.