John Croson's Blog Home: 2010

Friday, October 01, 2010

Ditch your backup tapes, use disks!

We've been backing up to LTO2 tapes for years using a Dell PowerVault 132T with an internal IBM Ultrium drive. Differential backups from our SAN don't take too long, but full backups take nearly an entire day. The backup set has grown with the business, and it's rapidly approaching 1 TB, spanning 3-4 tapes.

About 10 days ago the internal drive finally croaked: it swallowed a tape and wouldn't let go. I was able to retrieve the tape using directions found on IBM's website, but after another backup attempt failed the same way, we realized a change was needed.

What we needed was a reasonably priced, reliable destination for our data. I knew that a replacement LTO2 drive was going to be expensive, and would probably need to be a Dell-approved model with the proper firmware; picking up just any internal SCSI drive wasn't going to cut it, which was confirmed after poking around in Dell's support area.

I was able to find a refurbished drive for just under $500, but saw this as a quick fix and wanted to take the opportunity to bring in a solution that matched our environment.

What we decided on was an Addonics ST55HPMXA storage tower enclosure with five 1 TB SATA drives. The enclosure accommodates 5 drives without the optional internal racks or arrays.

It comes with a 5-port Silicon Image SATA RAID card and an external eSATA port, letting you connect another Silicon Image card in your system to it and use the provided software to create and manage RAID arrays.

The case is solid, with a 320 watt power supply and one cooling fan. I'm not sure what the noise factor is, since it sits in our rolling cabinet surrounded by two UPSs, an IBM pSeries, a Dell PowerEdge 2950, an AX150i SAN, and a Dell PowerEdge 2650.

Installing the drivers and software is straightforward, and my Windows 2003 server immediately saw the uninitialized drive. I chose GUID partition tables (GPT), since that gives me the luxury of resizing partitions far beyond what MBR allows (effectively unlimited in my case), plus up to 128 partitions, a limit set by Windows. I configured the drives in RAID 5, and sliced the 3.8 TB into 3 partitions to accommodate my differential, full, and IDR (Intelligent Disaster Recovery) backups.
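As a quick sanity check on the array size, RAID 5 usable capacity is (number of drives - 1) × drive size, since one drive's worth of space goes to parity. A minimal sketch:

```python
def raid5_usable_tb(num_drives, drive_tb):
    """RAID 5 dedicates one drive's worth of capacity to parity."""
    if num_drives < 3:
        raise ValueError("RAID 5 needs at least 3 drives")
    return (num_drives - 1) * drive_tb

# Five 1 TB drives leave roughly 4 TB of raw usable space,
# which the OS then reports a bit lower in binary units.
print(raid5_usable_tb(5, 1.0))
```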

The RAID configuration software works well, although I'd like it better if the monitoring portion ran as a service. As it is, you must leave it running on the console, in the system tray, to receive alerts about a drive failure or other issue. Alerts come by way of email, sound (if you have a sound card), and console pop-up. Guess that's what I get for not buying an Enterprise solution...

Symantec Backup Exec 12 reports that my data is transferring at about 200 MB/min, which is very close to what I expected. Addonics warned me that there would be a bottleneck at the controller card, but these speeds are almost 4 times faster than what we were getting to tape. Once these backups complete, a Duplicate Backup Sets job runs to send the data to an Iomega 1.5 TB drive for off-site storage.

Overall the cost was higher than a replacement drive ($800), but we are no longer tied to tapes and their fallibility.



Wednesday, September 08, 2010

Subsystem for Unix-based Applications and Windows Batch scripts

Some time ago, I installed the SUA package thinking I might use it for scripting and other tasks that I find easier to accomplish with GNU-like tools than with Windows ones.

Recently I needed to run a batch script that programmatically checks for the presence of some data files and injects them into MS SQL via DTS. This script had worked fine on another server for years, with a few tweaks now and then, yet during my tests on this machine it was suddenly failing.

What I found was that my script was using the FIND command, which exists both in Windows and in SUA. When my script called FIND, it was getting the SUA flavor, not the Windows version.

I checked my PATH environment variable and saw that SUA's path was listed first, then %SystemRoot%, and %SystemRoot%\System32, where the Windows FIND is installed. If I were a betting man, I'd say that since the SUA tools are listed before the Windows directories, they get found first. I've not tested this, but really didn't feel the need to, since calling the executable by its absolute path in the script (%WINDIR%\System32\Find.exe) solves my problem.
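The lookup rule itself is easy to demonstrate. This sketch (directory names hypothetical) mimics how the shell walks PATH left to right and stops at the first match:

```python
import os
import tempfile

def first_match(name, directories):
    """Mimic shell lookup: the first PATH directory containing `name` wins."""
    for d in directories:
        candidate = os.path.join(d, name)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None

# Two stand-in directories, each with its own `find` executable
root = tempfile.mkdtemp()
sua = os.path.join(root, "sua-bin")
system32 = os.path.join(root, "system32")
for d in (sua, system32):
    os.mkdir(d)
    exe = os.path.join(d, "find")
    with open(exe, "w") as f:
        f.write("#!/bin/sh\n")
    os.chmod(exe, 0o755)

print(first_match("find", [sua, system32]))  # the SUA copy shadows Windows
print(first_match("find", [system32, sua]))  # reversing PATH flips the winner
```

On a real Windows box, `where find` at a cmd prompt lists every match in resolution order, which makes this kind of shadowing easy to spot.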

Hope this helps you!


Tuesday, September 07, 2010

Milwaukee PC Non-Customer Service

Last week Friday, a tape got stuck in our Dell PowerVault 132T library. Luckily, I was able to remove the tape and restore order to our backups...or so I thought.

After checking in on Saturday, it was clear that we were going to be without backups, something that makes my skin crawl.

So promptly after our return from a nice long Labor Day weekend, I rushed out to purchase an external drive to back up data to.

My choice for such emergency trips is usually Milwaukee PC, since they are likely to have what I need in a pinch, even if it costs a few dollars more.

They indicated there was indeed a 2 TB drive in stock, so I bought an enclosure and we headed over to the display case to retrieve the drive. Unfortunately, the drive was nowhere to be found...bummer. A 1 TB drive was likely not going to be quite large enough, but it was going to have to do. I started planning what data to trim from the backups to make room, then suddenly saw the Radio Shack sign on my right...what the heck, I'll check and see if they have what I need.

Sure enough, they had an Iomega 1.5 TB drive in an enclosure. Just big enough for a full backup of all our stuff. Sold! Now to head back to Milwaukee PC to return the smaller drive.

Only 15 minutes had passed, and this was the response I got from the clerk, even after I expressed my surprise and dismay: "Sorry sir, but your purchase will be subject to a 15% restocking fee." And, "That's our policy..."

Unbelievable. In today's climate you'd think the loss of my business and everyone else I tell this story to is worth more than $24.

Never will I shop at Milwaukee PC again, and I will dissuade my customers, family and friends from making the same mistake I did.


Sunday, August 22, 2010

Today during errands, we stopped by Goodwill and picked up this excellent enameled cast iron stew pot for $10. It inspired me to cook up an Indian stew for the family.

My wife liked it so much she suggested I blog what I did, since it was all spur of the moment, like so much of my cooking is.

The ingredients are as follows, in what I remember to be the approximate quantities...

  • 1/2 Vidalia onion, diced
  • 2 stalks celery, chopped
  • 2 medium carrots, chopped
  • 1 green chili pepper, chopped
  • 1 green bell pepper, chopped
  • 1 1/2 large tomatoes, chopped
  • 2 small white potatoes, diced
  • 2 cloves garlic, minced
  • 2 teaspoons minced ginger
  • 3 or 4 mint leaves
  • 3 or 4 bay leaves
  • 4 or 5 of a small green leafy, nutty spice I picked up at an Indian store, but have NO idea what it is...
  • 1 teaspoon cardamom powder
  • 1 teaspoon turmeric powder
  • 1 teaspoon coriander powder
  • 1 teaspoon cumin seed
  • 2 teaspoons mustard seed
  • 1 teaspoon fish sauce
  • 2 tablespoons lemon juice
  • 1 can coconut milk
  • 2 teaspoons chicken bouillon
  • 1/2 lb cubed pork tenderloin

Drop 1 or 2 tablespoons butter and a similar amount of olive oil in your pan over medium heat. The olive oil will help keep the butter from burning. Take a sip of wine, beer or your favorite spirits.

As you chop your veggies, add them to your pan. Take a sip of wine, beer or your favorite spirits, preferably between vegetable types. Stir often.

Add all the spices, stirring frequently. Let the mixture cook and allow the tomatoes to break down. Take a sip of wine, beer or your favorite spirits.

Add the coconut milk, fish sauce, chicken bouillon and about a cup of water. Bring to a simmer and taste test. Adjust your spices to your liking. Take a sip of wine, beer or your favorite spirits.

Add the pork, or substitute your favorite meat. Take a sip of wine, beer or your favorite spirits.

Start making some good Basmati or similar rice. Take a sip of wine, beer or your favorite spirits.

Let your stew simmer uncovered for about 45 minutes, or until the veggies are al dente.

Eat lots, and don't forget to take a sip of wine, beer or your favorite spirits.... ;-)


Wednesday, June 09, 2010

Ubuntu AMR Codec

What a PITA. I have a cool smart phone that takes video, but I can't get the audio to work in Ubuntu, since Intrepid and Jaunty don't appear to ship with the AMR codecs compiled into ffmpeg or GStreamer.

Now, you can try to compile the codecs from source, and I've certainly done this sort of thing before; I can build my own kernel (in the days BEFORE xconfig), run Slackware/Gentoo, etc. This just didn't work for me.

BUT, there is an easy solution.



That's right: YouTube. Upload your video, wait for it to process, and download the resulting MP4 from the My Videos area of the control panel. Presto, you've got your audio converted to something usable in short order.


Windows 2003 Migration

I haven't had to do a migration for some time, since my consulting days ended 3 years ago.


Even then, I'd only done a couple of 2000 to 2003, and one SBS 2000 to SBS 2003 migrations.

This project involved moving from SBS 2003 to a Standard 2003 environment. I was a bit intimidated, and purchased the excellent Swing It! SBS Migration kit from Jeff Middleton. One thing working in my favor was that Exchange had been moved to a hosted solution last year, and they didn't use SharePoint.

Turns out this was a good purchase for a guy doing a solo project with few peers to bounce questions or ideas off of. Jeff freely provides answers and guidance in his forums, and his expert help is just a phone call away if you are current with your subscription. Pre-pay, and your support requests are answered even more quickly.

Fortunately, I only needed to ask a few questions, and I was on my way with the docs and tool-sets provided by the kit.

I started the process by setting up a temporary DC in a virtual environment, then basically took the following steps to gradually move away from the old SBS box:

  1. Migrate all file, print, DHCP, HTTP, etc. services off of SBS to the temp DC or a member server.
  2. Promote the temp DC, and give it a copy of the Global Catalog.
  3. Check the Application log for a successful AD transfer, and check DNS.
  4. Install WINS on the temp DC if necessary, and create a replication partnership with SBS to transfer its info.
  5. Take the temp DC offline; unplugging works better, since some services like DNS and WINS don't like the NIC being disabled.
  6. Seize all FSMO roles from the SBS machine. It will complain and appear to error, but this is normal.
  7. Use ntdsutil to remove all SBS references from the metadata.
  8. Remove all SBS references from DNS and WINS. This is where Jeff's tools shined, and saved me lots of time.
  9. Unplug the NIC on the SBS box, and bring the temp DC online.
  10. Enjoy an SBS-free zone.

In this scenario, the temp DC can be a temporary or permanent home for AD; it's really up to you. Just keep in mind that if your only AD DC is virtualized and it's your ONLY DNS server, you may have issues with users logging in, and other DNS-related problems, until the virtualized DC comes up.

For this project, we opted to install a small 1U 2008 AD server that holds all FSMO roles.

A couple of things I noted:

  • If you have other AD DNS servers in the environment, remove DNS from them, otherwise you'll be cleaning references to the old SBS off them, and causing bigger issues for yourself when you bring your temp DC back online.
  • Double-check your login scripts for references to your old SBS box.
  • Check printers and other devices that have static IP addresses to ensure those settings reflect your environment changes.

Wednesday, May 26, 2010

LastPass Enterprise

The Environment


I work in an industry that deals with Protected Health Information (PHI). According to the Health Insurance Portability and Accountability Act (HIPAA) we need to apply certain measures to protect this information, and continue to improve those measures.

Over time our company has increasingly found itself turning to the Internet to retrieve information regarding patient visits, explanations of benefits, remittance advice, bank statements, etc. The list grows daily.

The Problem

As this list grows, so do the accounts (username & password, and sometimes a challenge question). Keeping track of these is becoming a daunting task, and an even greater one is maintaining standards on password strength.

In the IT support world, it is common to find weak passwords and poor account management practices.

The Solution

Let's face it: most of us are lazy when it comes to security and creating meaningful passwords. In defense of "us," I'll be the first to say it is a PITA to remember a litany of passwords, let alone mentally dig up or create one that is actually complex.

LastPass is a giant step forward to solving this part of "The Problem". It is quite literally one of the slickest ideas I've come across to solve this issue for our environment.

It's got many things going for it:

It's CHEAP. Free for the web and browser plugin version. $12 per year to add the Mobile version to your phone. $24 per year per person for the Enterprise version.

Customer Service is superb.

Features for the free version:

  • ONE MASTER PASSWORD. Of course, you will want this password to be VERY STRONG.
  • Automatic Form Filling.
  • One Click Login.
  • Synchronize across browsers.
  • Secure ANY type of text data.
  • Share your passwords with friends.
  • Export/Import from many different password repositories (Firefox, IE, KeePass, etc.)
  • Generate more secure passwords.
  • Backup your passwords.
  • Identities separate personal from others (like work).
  • Screen keyboard to further protect your master password from keyloggers.

As noted above, purchasing LastPass also gets you the Mobile version, plus LastPass Sesame, which turns your thumb drive into a multifactor authentication device.

Enterprise Implementation

I have a rather ordinary Windows Domain, with a Terminal Server and a number of XP workstations. I wanted to enable the automated login process noted on their Enterprise pages.

What I had to do was add this line to a batch script referenced by a Group Policy that applies to a group of target computers, specifically in Computer Configuration -> Windows Settings -> Scripts -> Startup:

lastpassfull.exe --userinstallie --userinstallff --userinstallchrome  --installforallusers -j "%PROGRAMFILES%\LastPass"

The next step was to get the user logged in automatically, with no knowledge of the master password. The reason for this is to keep them out of these accounts when not on site, protecting the PHI as best we can. This is achieved by the following line, run by the user's login script under their security context:

"%PROGRAMFILES%\LastPass\lastpass.exe" -cid=12345678

For those in management who needed access outside the building, I simply didn't add this line, which forces them to log in manually.


Monday, May 10, 2010


I've been using FileZilla FTP server for some time now and have been happy with its performance.
Recently, we needed to expose the FTP service to another client, and the documents we'd be receiving would arrive un-encrypted, unlike our other clients' documents.
I decided I could simply enable FTPS, the SSL-enabled FTP protocol, open port 990 on my ASA 5525 Security Appliance, and NAT traffic to our server. Unfortunately, I quickly found out that a passive FTPS server behind my firewall won't work without some specific configuration changes, as discussed in this article.
With all that fussing around, I decided to check out freeFTPd, a single daemon that offers both FTP and SFTP (not to be confused with FTPS; SFTP is the secure file transfer protocol common to SSH, the secure shell).
It's fairly straightforward, but a bit quirky, and the documentation is non-existent. Follow my tips below to ensure a good working server, with freeFTPd starting reliably as a service.

GUI vs Service

  • The SERVER is the state used when starting FTP and SFTP via the GUI.
  • The SERVICE is when FTP and SFTP is started as a Windows Service.
The GUI does not reflect the current state of the service; it only correctly reports the state of the server if you used the GUI to start it. Your best bet is to open cmd and use netstat -an to check the state.
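If you'd rather script that check than eyeball netstat output, a small probe of the listening ports works too (ports 21 for FTP and 22 for SFTP are assumed here; match them to your freeFTPd settings):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if something accepts a TCP connection on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Assumed freeFTPd ports: FTP on 21, SFTP on 22
for name, port in (("FTP", 21), ("SFTP", 22)):
    state = "listening" if port_open("127.0.0.1", port) else "NOT listening"
    print(f"{name} (port {port}): {state}")
```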

Apply Configuration Changes Often

The best tip: while you are using the GUI to configure the service, click Apply often, and ESPECIALLY after you start the service.
Evidently the service restores the server to the last state that was applied. So if you had the FTP server stopped, configured home directories for users, clicked Apply, and THEN started the service, do not expect your FTP server to be started for you when the machine reboots.

Don't Rely on Windows Service

For some reason unknown to me or others, the freeFTPd service doesn't start reliably on Windows restart for some of us.
Instead, set the service to start Manually rather than Automatically, and use something like the following batch file to start it a bit late and notify you on failure, assuming you've got the IIS SMTP service installed somewhere.

:: //////////////////////////////////////////////
:: Set the log file location
@SET _LOG="C:\Program Files\freeFTPd\ftpstartup.log"

ECHO ------------------------------------------------ >> %_LOG%
ECHO -- START %DATE% - %TIME% -- >> %_LOG%
ECHO ------------------------------------------------ >> %_LOG%

:: //////////////////////////////////////////////
:: Write the sleep operation to the log and sleep
:: (ping is the stock way to sleep on Windows 2000/2003)
ECHO Sleeping 30 seconds >> %_LOG%
PING -n 31 127.0.0.1 > NUL

:: //////////////////////////////////////////////
:: Start the service and log it
ECHO Starting service >> %_LOG%
NET START freeFTPDService >> %_LOG%

:: //////////////////////////////////////////////
:: Look for the services listening on our ports
:: (ports 21/22 assumed; adjust to your FTP/SFTP settings)
ECHO Looking for FTP Listener... >> %_LOG%
netstat -anp TCP | findstr /R /C:"TCP.*:21 .*LISTENING" >> %_LOG%
IF ERRORLEVEL 1 GOTO NOTIFY

ECHO Looking for SFTP Listener... >> %_LOG%
netstat -anp TCP | findstr /R /C:"TCP.*:22 .*LISTENING" >> %_LOG%
IF ERRORLEVEL 1 GOTO NOTIFY
GOTO :EOF

:NOTIFY
:: //////////////////////////////////////////////
:: If this fails, log it and send a notification
SET _MSG=freeFTPd did not start - no listener found
ECHO #### %_MSG% >> %_LOG%

:: //////////////////////////////////////////////
:: Set the temp file location (path/name arbitrary)
SET _TEMPMAIL="%TEMP%\freeftpd-alert.txt"

:: //////////////////////////////////////////////
:: Echo the basic headers to the temp file
:: (FROM/SUBJECT added here; pickup delivery needs them)
ECHO TO: "Croson, John" ^<mine@DOMAIN.COM^> > %_TEMPMAIL%
ECHO CC: "Demarais, David" ^<his@DOMAIN.COM^>,"Hayssen, Jill" ^<hers@DOMAIN.COM^> >> %_TEMPMAIL%
ECHO FROM: "freeFTPd Monitor" ^<ftp@DOMAIN.COM^> >> %_TEMPMAIL%
ECHO SUBJECT: %_MSG% >> %_TEMPMAIL%

:: //////////////////////////////////////////////
:: Echo the blank line that separates the header from the body text
ECHO. >> %_TEMPMAIL%

:: //////////////////////////////////////////////
:: Echo the body text to the temp file
ECHO Check %_LOG% for details.>> %_TEMPMAIL%

:: //////////////////////////////////////////////
:: Move the temp file to the mail pickup directory
:: (default IIS SMTP pickup path; adjust for your install)
MOVE %_TEMPMAIL% "C:\Inetpub\mailroot\Pickup"

From Start, Run, open mmc, choose add/remove snap-in, and add the Group Policy Object Editor for the local computer. Go to Local Computer Policy --> Computer Configuration --> Windows Settings --> Scripts (Startup/Shutdown). Open the Startup script and add the file you saved above. Apply the setting.
Keep an eye on this log to make sure your service starts. You may have to tweak the sleep time to get this to work. This works well for me on a Windows 2000 Server SP4.

Mapped Drives

I've configured two users. One I can get to use a mapped drive on the server (H:), and the other I cannot (Z:). It might be the letter, but I was able to work around it by using a UNC path (\\server\folder). Your mileage WILL vary.
Hope this helps someone else scratching their head as hard as I was!

Friday, March 05, 2010

Dashboard Samples

Just finished posting two examples of the dashboards I've been creating for a few of our clients.


One is a Visit Log, and the other is a Demographic Map.

The demo went well with one of our clients; they even appeared a bit excited at the possibilities it represents for them...

I'll start working on exposing other data, like Relative Values of services provided, attached receipts, etc.


Friday, February 05, 2010

AllScripts Tiger to MS SQL to GoDaddy Hosting

Some time ago, I wrote an article about using Cognos Impromptu to export data to CSV from a then-Misys, now-AllScripts Tiger data source, and using DTS in MS SQL Enterprise Manager to import it for display in ASP pages.
I started using this method to export other types of data, in an attempt to create a dashboard for our clients. I quickly found that Impromptu is not well suited or reliable enough for scheduling multiple jobs.
The requirements and caveats for using Impromptu in this manner are:
  1. Each report must be scheduled so it is no longer running when the next one starts. Impromptu will not let the next scheduled report wait very long, and this can cause the "chain" of reports to fail.
  2. The workstation running the reports must have a user logged in with the Cognos Scheduler application running in order to fire off the reports. They cannot run via Windows Scheduled Tasks (AT, for us old-timers).
  3. If you run your reports against multiple companies, as we do, a macro must be run to switch from company to company between reports.
Since I need high reliability and speed, I decided to investigate using the installed Transoft ODBC drivers and Visual Basic ADO to retrieve the data. The Transoft documentation was a bit sparse on this method, and alludes to a sample application bundled with the drivers, but sadly AllScripts saw fit to strip it out of their AllScripts Query installer package.
What I discovered was that it is quite simple to connect to the Micro Focus COBOL Server Express data files. A DSN connection string is created for each company we connect to with AllScripts Query, so the following is a sample of how to retrieve all Insurance Plans for a given company:
    Dim Conn As ADODB.Connection
    Dim Rs As ADODB.Recordset
    Dim oFile As Scripting.FileSystemObject
    Dim strSql As String
    Dim strConn As String
    strSql = "SELECT T1.PLAN_INS_NUM c1 , T1.PLAN_NUMBER c2 , T1.PLAN_NAME c3 , T1.PLAN_ADDRESS1 c4 , T1.PLAN_ADDRESS2 c5 , T1.PLAN_CITY c6 , T1.PLAN_STATE c7 , T1.PLAN_ZIP c8 , T1.PLAN_PAT_TYPE c9 , T1.PLAN_NOTES c10 , T1.PLAN_PHONE1 c11 , T2.P_NAME c12 from P01 T2, INS_PLANS T1 Where T2.P_NUMBER <= T1.PLAN_NUM_KEY order by 1 asc , 2 asc"
    strConn = "DSN=Company_" & strCompany & ";uid=MyUid;pwd=MyPass"
    Set Conn = New ADODB.Connection
    Conn.Open strConn
    Set Rs = New ADODB.Recordset
    Rs.CursorLocation = ADODB.CursorLocationEnum.adUseServer
    Rs.Open strSql, Conn, adOpenForwardOnly, adLockReadOnly
    Set oFile = New Scripting.FileSystemObject
    Dim ouFile
    Set ouFile = oFile.OpenTextFile("\\MyServerPath\" & strName & "InsurancePlan.csv", ForWriting, True)
    ' Write the header
    ouFile.Write Chr(34) & "insurance_number" & Chr(34) & "," & _
                    Chr(34) & "plan_number" & Chr(34) & "," & _
                    Chr(34) & "name" & Chr(34) & "," & _
                    Chr(34) & "address1" & Chr(34) & "," & _
                    Chr(34) & "address2" & Chr(34) & "," & _
                    Chr(34) & "city" & Chr(34) & "," & _
                    Chr(34) & "state" & Chr(34) & "," & _
                    Chr(34) & "zip" & Chr(34) & "," & _
                    Chr(34) & "default_pt_type" & Chr(34) & "," & _
                    Chr(34) & "notes" & Chr(34) & "," & _
                    Chr(34) & "phone" & Chr(34) & vbCrLf
    Do While Not Rs.EOF
            ouFile.Write Chr(34) & Trim(Rs.Fields.Item(0)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(1)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(2)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(3)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(4)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(5)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(6)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(7)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(8)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(9)) & Chr(34) & "," & _
                            Chr(34) & Trim(Rs.Fields.Item(10)) & Chr(34) & vbCrLf
            Rs.MoveNext
    Loop
    ' Close the file and release the database objects
    ouFile.Close
    Rs.Close
    Conn.Close

The company number is set via a required argument to the program I've written, as is the report type, since I have several sub-routines that determine the type of data to dump.
You can refer to the article mentioned above to review how to import via DTS (Data Transformation Services).
My data is sent to GoDaddy (ya, ya, ya...some day we'll get a real hosting company) via the Microsoft SQL Server Database Publishing Wizard. It is scriptable, and that is how I use it today. You can get it directly from CodePlex, or review GoDaddy's knowledge base article about it.
Running sqlpubwiz.exe help publish will give you a list of command-line options. You may use these to form your connection manually, or you can take the lazy route like I did and run the GUI. That lets you create your connections permanently, easily referenced by aliases in your script. Remember to include the program's path in your PATH environment variable, or adjust your scripts accordingly.
There is a nice article at GoDaddy on the steps required to publish your database, accompanied by nice screenshots. You'll want to review it to determine what can and can't be uploaded, and the database space limitations. One thing you won't find there is something I discovered the hard way: even though EXEC sp_spaceused reported the table I intended to upload as 156mb, GoDaddy would not accept it, because the destination table isn't dropped before the source is uploaded, and the two copies combined exceed the 200mb limit. This forced me to trim the data down before uploading it.
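The failure mode is easier to see as arithmetic: because the destination table isn't dropped before the upload, the old copy and the new copy must coexist under the cap. A toy check (limit per GoDaddy's plan at the time):

```python
def upload_fits(source_mb, existing_mb, limit_mb=200):
    """The destination isn't dropped first, so both copies count at once."""
    return source_mb + existing_mb <= limit_mb

# A 156 MB table uploaded over its own 156 MB predecessor needs 312 MB,
# blowing past the 200 MB cap even though either copy alone fits.
print(upload_fits(156, 156))
```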
The biggest hurdle for me was forming my queries, as I had no intimate knowledge of the database schema. Impromptu quickly remedied that. After running a report, go to the Report pull-down menu and select Query.
In this window, select the Profile tab, and the SQL option. This reveals the SQL query used to retrieve your data. You'll notice that if you've got some sophisticated grouping, sorting or calculations happening in your report, they aren't necessarily represented here. That's because legacy C-ISAM "databases" have a limited number of built-in functions, and Impromptu handles much of that itself, sometimes at a high processing cost to your workstation.

Your best approach is to strip all the grouping, sorting and calculations out of your report, then copy and paste the SQL into Transoft's Win U/SQLi32 to verify you are capturing the data you want. You can then massage your data a bit in VB, during the DTS import process, or ultimately when you regurgitate your data for other uses, because you will find that only a few SQL functions are available through Micro Focus COBOL Server Express.

In my case, that would be through ColdFusion and the excellent amCharts Flash charting product. These charts can take much of your dry practice management data and turn it into a more vibrant form of information.
I'm using this in conjunction with the Google Maps API to help our practice managers and physicians better understand where their visitors are coming from, and other demographic information.