ISPS offers custom software consulting services with a specialty in telephony integration projects. My business partner and I founded this company and a related venture.
TSPS is a technology company catering to the trucking industry. They create information services and hardware to aid in planning and managing truck parking, among other things. I was hired initially to focus on developing front-end customer-facing user interfaces. I worked there until they ran out of funding, let go of all paid staff, and ceased operations.
Stealthbits and its parent Netwrix create cybersecurity software for mid-to-large corporations. I worked on two major products during my time there. Although I did full-stack work, I focused primarily on improving and expanding the user interfaces of both because of my strong web and design skills.
Amphion provides medical transcription and coding services for hospitals and similar facilities. They hired me to help them replace their leased software systems with an in-house alternative and to position them as a transcription technology provider for their own customers as well. My key role was as a .NET (C# and VB.NET) programmer and SQL Server expert.
Pharmacy OneSource, or "P1S", sells hosted web-based software that hospital pharmacies use to manage compliance and cost-saving measures. They hired me to raise the level of sophistication of their apps. My key role was as the UI architect. I developed a new framework with a strong emphasis on DHTML / AJAX client-side behavior. To ensure quality, I was integrated into the first major application built on it and served as trainer and mentor to a rapidly growing development team.
An old colleague and I cofounded ISP Services. We decided to build an organic marketing platform to attract ISPs by offering a national directory of all ISPs so anyone could find out exactly which ones offer service to their homes or businesses. I built the entire system in Node.js, TypeScript, Postgres, and HTMX.
An old colleague and I cofounded ISP Services. We were originally doing custom consulting work for a vendor of an ISP billing system. They encouraged us to develop an off-the-shelf system that ISPs could add to their websites to make it easy for their customers to learn about service plans and then sign up online. I built the entire system in Node.js, TypeScript, Postgres, and HTMX.
ISPS had a client that was in the process of switching their billing system to Sonar. They had been using an old Access database to manage their customer workflow. We modernized it as a web app.
ElephantTalk was founded by an old colleague who invited me to join him. The principal goal was to develop an alternative way for smaller ISPs to offer telephony services to their broadband customers without having to implement and manage the expensive infrastructure required to meet regulatory requirements. The founder had a long history in the ISP and telephony market. He developed the backend core services. He hired me primarily to develop a complete user interface for administrators, ISP clients, and end users of their telephony services. I built the UI primarily using React with TypeScript.
Snuggle Hamster Designs (snugglehamster.com) was a pet project that a fellow graphic designer and I embarked on to create and sell unique designs on posters, coffee mugs, and other items via the print-on-demand (PoD) service Zazzle. To help promote our products and make them easier to find and organize, I created a website with SEO in mind using Next.js. The site was also built to help other Zazzle designers promote their products, as well as to provide us with an additional revenue stream.
TSIS is a SaaS product for the trucking industry. It brings together parking lot owners and trucking fleets in one place so drivers and fleet managers can find and reserve spaces, similar to Expedia. I was hired during a very busy period when the principal developer and product owner needed to focus on other projects, leaving me for six months with very little direction to continue fleshing out the system according to what he had set in motion before I joined. I brought a project that had been stalled for a few years to near its beta go-live. In the latter six months the principal developer returned, along with a third developer, to add much more functionality, and we brought the product to the beta testing stage with a first key client. I took on many development responsibilities, including creating a sophisticated multi-tenant Stripe integration that lets our customers manage their own payment accounts, building an invoicing system from scratch, and working with a longtime graphic artist to revamp the UI framework and app styling.
SbPAM is cybersecurity software that provides task-based administrative access for Active Directory users and groups to servers, websites, and other resources within a corporate network. Security administrators define well-known tasks and the servers those tasks involve. When a user wants access to one of those servers, they make a request through SbPAM, which can either let them in immediately or forward the request to a security admin for approval. The user's entire session gets recorded for later review, and most importantly, the user's access is strictly limited to the duration of that session. I was one of the first developers to join the project after its initial one-person R&D phase. Although I worked on all aspects of the software, I focused largely on improving the user interface because of my strong UI skills.
StealthIntercept is cybersecurity software that audits and blocks attempted Active Directory (AD) changes and other requests. In doing so it makes it easier to detect and limit the typically automated attacks that focus on AD, which is often the primary target of penetration attacks. I was not a primary developer on this project. When a next-generation version was in the works, I was asked to develop a starter project using the same React and C# foundation found in SbPAM and StealthDEFEND, and I provided extensive mentoring to several developers working from that seed.
StealthDEFEND is cybersecurity software that collects system events from Active Directory, select servers on a network, and other sources and identifies potential threats from defined and emergent patterns in the stream of events. I worked as a full-stack developer on the project, although I focused largely on improving its existing user interface because of my strong UI skills.
I decided to deepen my experience with Node.js and MySQL by replicating the application framework and pattern I've used repeatedly for enterprise projects built on an ASP.NET and SQL Server stack.
Goals
After over a decade of web work I had settled into a strategy for larger web applications oriented toward rapid development and high performance. All of my implementations have used a thin ASP.NET middle tier that contains almost no business logic. Instead, almost all business logic has resided in stored procedures in SQL Server alongside the application data. Almost all UI code then resides in static JavaScript and CSS files downloaded to the client. The middle tier exists only as a sort of pass-through that glues the browser client to the database. Browser-accessible SPs all have a special prefix and rely on common procedures to limit access and otherwise prevent hacking.
Because the middle tier is really a thin "glue layer", I reasoned that it could be implemented using virtually any combination of programming language, operating system, and web server. To that end I decided to try crafting a pure Node.js (not Express) middle tier with all the hallmarks of my ASP.NET implementations. At first I had it connecting to a SQL Server database. Then I modified it to connect to a MySQL server.
In the same vein, I reproduced in MySQL the essential framework components I typically include in SQL Server. When a page request is received, one central stored procedure is called that finds the appropriate page SP along with the metadata needed to craft a call to it. In SQL Server implementations that first SP constructs a dynamic SQL script and executes it directly. Because MySQL imposes stiff constraints on dynamic SQL, I decided to let that SP return its findings and have the Node.js script make the second SP call.
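To illustrate, here is a minimal sketch of that two-step dispatch on the Node.js side, assuming the mysql2 driver; the procedure names (page_dispatch, the pg_ prefix) and metadata columns are illustrative stand-ins rather than the actual names from the project.

```typescript
// Minimal sketch of the two-step dispatch described above, assuming the
// mysql2 driver. Procedure names (page_dispatch, the pg_ prefix) and the
// metadata columns are illustrative stand-ins, not the project's real names.
import { createServer } from "node:http";
import { createPool } from "mysql2/promise";

const pool = createPool({ host: "localhost", user: "app", database: "appdb" });

createServer(async (req, res) => {
  try {
    const url = new URL(req.url ?? "/", "http://localhost");
    const page = url.pathname.slice(1) || "home";

    // Step 1: the central dispatcher SP looks up which page SP to call
    // (e.g. "pg_orders_list") and the parameter names it expects.
    const [dispatchSets] = await pool.query("CALL page_dispatch(?)", [page]);
    const { target_proc, param_names } = (dispatchSets as any)[0][0];

    // Step 2: Node crafts and executes the second call from that metadata.
    // (In the SQL Server version, the dispatcher runs this itself via dynamic SQL.)
    const names: string[] = param_names ? param_names.split(",") : [];
    const values = names.map((n) => url.searchParams.get(n));
    const placeholders = names.map(() => "?").join(",");
    const [pageSets] = await pool.query(`CALL ${target_proc}(${placeholders})`, values);

    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify((pageSets as any)[0]));
  } catch {
    res.writeHead(500);
    res.end("Server error");
  }
}).listen(8080);
```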
I honed this application design pattern over about a five-year period for one client. I was able to reproduce most of it in about a day using Node.js and MySQL.
In cooperation with a 3D modeler and a YouTube videographer and promoter, I created a fast-paced VR team sport game in Sansar. Players rode hoverboards and fired projectiles at the opposing team to score points. HoverDerby was the very first real sport in Sansar. It gained attention via Forbes magazine and other media coverage, including our own weekly YouTube-televised competitions. HoverDerby pushed Sansar to its technical limits and served as a critical test case for Linden Lab as they introduced new interactivity features to Sansar.
In Sansar I created Clockworks, a rich dynamic motion system that enabled experience designers to build complex machines out of static objects that could "dance" around in very complicated patterns. Users could program the sequences in a way inspired by music composition. I created dance-club-like spaces with over a hundred moving lights and robot-like elements all keeping perfect time with the included dance music. Others and I used Clockworks to build interactive VR stories with animatronic-like robots playing various roles.
When Sansar opened its doors to the public in 2017 there were already over a thousand experiences created by residents, but most of them were still static, lacking any interactivity for visitors. I joined before the open beta as a "scripter" and introduced some of the first modular scripts, under the "Reflex" brand name, to enable 3D modelers to add interactivity to their experiences and products. Reflex went through four complete rewrites and grew to over 50 off-the-shelf components. Later versions also included JSON-based integration with off-world web services.
This was largely an independent research project with an eye toward a practical product. I evaluated and implemented a variety of machine learning algorithms. I took a deep dive into human language structure and learning. I built algorithms for morphological parsing and English segmentation and parsing. One practical goal for this project was a suite of tools to help authors analyze and improve their storytelling and for literary agents and publishers to find and evaluate new works.
The client was opening a cold storage business and had no existing software to manage it. My task, and that of my fellow developers, was to create an entire new ERP system from scratch. I created entire UI, middle tier, and data layer frameworks and led the design, programming, testing, and deployment efforts. We built many technical features, including graphical visualizations, an engine for rich printables like labels and forms, Google-style data search, detailed permissioning and auditing, and much more. We created application features that enabled them to manage their storage capacity and inventory and empowered their customers to characterize and monitor their own inventory remotely. We also created an entire contract and order management capability.
Amphion's medical transcription system's primary output to customers is printable documents, often handouts for patients. The company decided to replace the third-party rendering engine with a proprietary one. I defined a custom templating language with support for merging data from HL7 CDA source documents, including Amphion's embedded proprietary metadata. I designed and coded the template editor and rendering engine.
The client had been using a specialty line of wireless barcode scanners with numeric keypads from Symbol that were no longer being manufactured and thus growing more costly to replace. Motorola, which bought Symbol, offered a modern replacement that was actually a complete Windows CE computer married to a similar wireless barcode chassis. My task was to develop a custom application to run on the device and a proprietary service host program to run on each client machine. That program incorporated massive speed improvements over the alternative options. It also allowed the user to instantly pair any gun, or several at once, to any terminal computer in their warehouse by scanning a barcode on the terminal. The Windows CE program suite featured automatic software upgrading, custom power management to extend battery life, and self-healing of failures caused by design flaws in the device.
The client was relying on a software system that was over forty years old and needed modernization. My task, and that of my fellow developers, was to create an entire new ERP system from scratch. I created entire UI, middle tier, and data layer frameworks and led the design, programming, testing, and deployment efforts. We built hundreds of features, including graphical visualizations, an engine for rich printables like labels and forms, Google-style data search, detailed permissioning and auditing, and much more. We produced countless screens and features for operational and management use by all departments, including operational features for catalog, inventory, storage, manufacturing, shipping, and accounts payable and receivable management, plus extensive tools for monitoring long-term trends and projecting future needs.
Amphion's medical transcription and coding businesses relied heavily on a fleet of separate programs that communicate largely by consuming and creating data files. Any program failure typically results in a backup in the vast "pipeworks". I created a system of Windows services to sit on all related servers, watch for failure conditions in the flows, and report status and errors to a central server. Administrative users could configure and monitor conditions via a website.
The Triton transcription system matches audio dictations with patient / encounter / order information sent via standardized HL7 messages from hospitals' medical records systems. Amphion needed a replacement for its older, rigid, overly complicated import system. I created a reusable library and corresponding Windows service to import raw HL7 files to populate Triton's "demographics" database.
Goals
Like many medical software systems, Triton, a medical transcription system, relies heavily on patient "demographics". When a doctor records an audio file representing a report about a patient, that patient has already been checked into the facility and an "encounter" record created for the visit. Amphion has the facility send that patient/encounter/order information to Triton using the nearly ubiquitous HL7 standard format. The physician typically keys into the phone or other dictation system a medical record number, date of birth, encounter ID, or the like that more or less uniquely identifies the patient or encounter. The goal is that when the transcriptionist starts her work translating the audio into text, she already has the patient demographics associated with that dictation record and doesn't have to enter that information herself.
An HL7 message can represent many kinds of events that can occur in a medical system. Its structure is roughly a five-level hierarchical flat file. It may contain key information about the related patient (e.g., name and date of birth), the encounter (e.g., visit ID and discharge date), the order (e.g., details of a CT scan requested), and so on.
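For readers unfamiliar with HL7 v2, the sketch below shows how a message decomposes into that hierarchy (segments, fields, components). It is a conceptual TypeScript illustration only; the production importer was a .NET Windows service backed by SQL Server.

```typescript
// Conceptual sketch only: how an HL7 v2 message decomposes into the
// hierarchy described above. The real importer was a .NET Windows service.
type Hl7Segment = { name: string; fields: string[][] }; // field -> components

function parseHl7(message: string): Hl7Segment[] {
  return message
    .split(/\r\n?|\n/)                  // segments are one per line
    .filter((line) => line.trim().length > 0)
    .map((line) => {
      const fields = line.split("|");   // fields separated by |
      return {
        name: fields[0],                // e.g. MSH, PID, PV1, OBR
        fields: fields.slice(1).map((f) => f.split("^")), // components by ^
      };
    });
}

// Example: pull the patient name out of a (fabricated) ADT admit message.
const sample =
  "MSH|^~\\&|HIS|GeneralHospital|Triton|Amphion|202001011200||ADT^A01|123|P|2.3\r" +
  "PID|1||44321^^^MRN||Doe^Jane||19700101|F\r" +
  "PV1|1|I|ICU^02^1|||||||||||||||987654";

const segments = parseHl7(sample);
const pid = segments.find((s) => s.name === "PID");
console.log("Patient:", pid?.fields[4]?.join(" ")); // PID-5 -> ["Doe", "Jane"]
```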
The service I created is responsible for creating new records and merging in changes to existing ones. Although the HL7 standard does provide a lot of consistency, most systems that support it have wide variation in how they choose to represent data. Further, facilities can customize HL7 mappings. The result is that each facility requires a unique set of mappings into Triton's demographics database. The system I designed allows for a shared set of mappings that all facilities inherit and the ability for administrative users to define custom overrides by facility. Further, each mapping can call out to custom scalar value functions (SVFs) in the database to do additional formatting and lookups as needed. As a result, nearly 100% of the mappings, including custom code, are represented purely in the database for maximal transparency and self-documentation.
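The inheritance idea can be sketched roughly as follows. This is a TypeScript illustration of the concept only; in the real system the mappings, overrides, and scalar value functions all lived in SQL Server, and the names here are hypothetical.

```typescript
// Conceptual illustration of shared mappings plus per-facility overrides.
// The real system stored all of this in SQL Server; names are hypothetical.
interface Hl7Mapping {
  targetColumn: string; // column in the demographics database
  segment: string;      // e.g. "PID"
  field: number;        // 1-based HL7 field number
  svf?: string;         // optional scalar value function for formatting/lookups
}

const sharedMappings: Hl7Mapping[] = [
  { targetColumn: "PatientName", segment: "PID", field: 5 },
  { targetColumn: "DateOfBirth", segment: "PID", field: 7, svf: "svf_FormatHl7Date" },
  { targetColumn: "VisitNumber", segment: "PV1", field: 19 },
];

// A facility-specific override replaces the shared mapping for the same column.
function mappingsForFacility(overrides: Hl7Mapping[]): Hl7Mapping[] {
  const merged = new Map<string, Hl7Mapping>();
  for (const m of sharedMappings) merged.set(m.targetColumn, m);
  for (const o of overrides) merged.set(o.targetColumn, o); // override wins
  return [...merged.values()];
}

// Example: one facility sends its visit number in a different PV1 field.
const facilityA = mappingsForFacility([
  { targetColumn: "VisitNumber", segment: "PV1", field: 50 },
]);
```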
I was responsible for developing the entire service and a series of intranet-based utilities that let administrators edit HL7 mappings, search for historical messages, view HL7 messages in parsed detail, simulate the import of any HL7 message, and reprocess any import.
Triton is the proprietary system used primarily by professional medical transcriptionists to turn audio recorded by physicians into printable textual documents. Amphion had used an existing system from an outside vendor for years and had paid an outside consulting company to craft a replacement. I and other in-house staff were tasked with improving and integrating their prototype with our existing infrastructure. My most significant contribution to the user interface of Triton was improving overall performance by at least two orders of magnitude, and the performance of key search pages by even more.
P1S had a need to allow users of its web apps to upload and "attach" their own files (e.g., Word documents) to database records. The typical solution of simply storing them on public web folders was not sufficiently secure or scalable, so I developed a database-centric component that can be easily plugged into its web apps.
Goals
Uploading files through the web is not new, but there are many approaches to the question of how to store and retrieve such files. One simple way is to store uploaded files in a folder somewhere under a web site's root folder (e.g., "/MyWebApp/Attachments"). One problem with this is that it is insecure by default. Anyone who knows the name of a file can retrieve it. Storing the binary contents of a file in a database solves this problem because the web app can serve as a gateway, controlling who sees which file.
Another problem with a public web folder is that it doesn't by default work with a web server cluster. One has to get or create a special service to copy every uploaded file to every other web server in the cluster or choose to store files on a dedicated "attachment server", which itself doesn't have a proper backup. Storing binary file content in a database solves this problem by making the same file contents available to all web servers in a cluster.
Storing files in SQL Server comes with several risks. For one, it tends to bloat database files quickly; a single JPEG may contain more data than hundreds of ordinary database records. This can become a problem with large databases. FileDump solves this by allocating a dedicated FileDump database catalog that is separate from the catalogs used by the various apps that rely on FileDump.
Another risk is that it can be complex to deal with all the idiosyncrasies of transporting binary file content between a web interface and a database. FileDump solves this by encapsulating all of that in simple save and view methods. One example is that when the code wants to push contents back to a calling web browser, FileDump looks at the file's extension and decides which of hundreds of MIME types to pass along, a key clue the browser uses to decide how to handle the contents. Another example is the streaming model, which relies on a small buffer size to give users the ability to consume content even before it's fully downloaded, while minimizing the amount of memory used by the database or web server, even with gigabyte-size files.
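The streaming model looks roughly like this. It is a conceptual sketch in TypeScript (FileDump itself was an ASP.NET / SQL Server component), with readChunk and mimeByExtension standing in for the real pieces.

```typescript
// Conceptual sketch of FileDump's streaming model: small fixed-size buffers
// pumped from the database to the browser. FileDump itself was ASP.NET /
// SQL Server; readChunk and mimeByExtension are illustrative stand-ins.
import type { ServerResponse } from "node:http";

const mimeByExtension: Record<string, string> = {
  ".pdf": "application/pdf",
  ".jpg": "image/jpeg",
  ".docx": "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
};

const CHUNK_SIZE = 64 * 1024; // keep memory use small even for gigabyte files

async function streamAttachment(
  fileId: number,
  fileName: string,
  res: ServerResponse,
  readChunk: (id: number, offset: number, length: number) => Promise<Buffer | null>
): Promise<void> {
  // Pick a MIME type from the extension so the browser knows what to do.
  const ext = fileName.slice(fileName.lastIndexOf(".")).toLowerCase();
  res.setHeader("Content-Type", mimeByExtension[ext] ?? "application/octet-stream");

  let offset = 0;
  while (true) {
    const chunk = await readChunk(fileId, offset, CHUNK_SIZE);
    if (!chunk || chunk.length === 0) break; // end of file content
    res.write(chunk);                        // browser can start consuming immediately
    offset += chunk.length;
  }
  res.end();
}
```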
In anticipation of the need for versioning of uploaded files, FileDump makes it easy for application code to choose which of four versioning models (append, hide, overwrite, delete) to apply to newer versions of uploaded files.
Although the code component was made with ASP.NET apps in mind, I made a point of putting most of the business logic in SQL Server stored procedures. This makes it easier to develop mirror versions of the code component in other languages in the future.
P1S needed to deliver graphical charts through its main applications' web interfaces. I was tasked with selecting an off-the-shelf component. While we were able to find good ones, their pricing models would not have made financial sense given P1S' scalability model. I developed a reusable charting component from scratch to suit P1S' existing and near-future needs.
Goals
Most of the better off-the-shelf charting components cost a few hundred dollars per machine they are installed on, but our scalability model called for putting clients on groups of isolated application clusters, each of which had multiple web servers. This would have meant spending many thousands of dollars up front and ongoing as we brought new clusters online. Creating our own component would save P1S money up front and in the long run.
In addition, most of the better charting components were designed to connect directly to SQL Server to fetch data, which did not fit our goal of funneling all database requests through the apps' middle tiers. They also often feature XML interfaces, which didn't fit our model very well either. I kept WebCharts' interface very object-oriented and simple, especially with respect to getting data into it. Our apps tended to favor explicit loops for data manipulation, so WebCharts' model suited that better than the alternatives.
One key requirement was the ability to mix bar and line charts and have two Y axes. Most off-the-shelf components didn't support this easily; WebCharts made it trivial. WebCharts also made it very easy to explicitly control the axes, right down to the individual tick marks. One also had very fine control over the fonts and text for everything, from headers and legend entries down to individual tick marks. One could, for example, set a single tick mark's label to be red and bold to stand out. This degree of flexibility extended to the line and bar series and even individual data points, with the ability to customize the shape, size, color, line style, transparency, etc. of every last point, line, and bar.
One failing of many commercial components is that they don't choose "smart" ranges. If the Y range for a line chart is 0.5 - 23.7, for instance, they usually just divide that range evenly into perhaps 10 lines at oddball intervals, whereas WebCharts by default always chooses ranges based on clean orders of magnitude (0.1, 1, 10, 100, etc.), making the results easier for people to understand.
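The "smart range" behavior is essentially a nice-numbers calculation. The sketch below reconstructs the idea, not WebCharts' actual C# code: snap the tick interval to a 1/2/5 multiple of a power of ten, then extend the range to whole multiples of that step.

```typescript
// Sketch of the "smart range" idea: snap the tick interval to a clean
// 1/2/5 x power-of-ten step, then extend the range to whole multiples of it.
// This reconstructs the concept, not WebCharts' actual implementation.
function niceAxisRange(min: number, max: number, maxTicks = 10) {
  const rawStep = (max - min) / maxTicks;
  const magnitude = Math.pow(10, Math.floor(Math.log10(rawStep)));
  const residual = rawStep / magnitude;

  // Round the step up to the nearest "nice" multiplier: 1, 2, 5, or 10.
  const niceMultiplier = residual <= 1 ? 1 : residual <= 2 ? 2 : residual <= 5 ? 5 : 10;
  const step = niceMultiplier * magnitude;

  return {
    min: Math.floor(min / step) * step,
    max: Math.ceil(max / step) * step,
    step,
  };
}

// Example: the 0.5 - 23.7 range from above becomes 0 - 25 in steps of 5.
console.log(niceAxisRange(0.5, 23.7));
```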
I also made a point of providing full documentation of the component and its entire object model. In addition to applying it to the Quantifi project I was working on, I also provided mentoring to developers applying it to other projects.
Quantifi was still P1S' cash-cow product and one of its oldest. P1S no longer wanted to maintain the Java code base because new development was favoring ASP.NET, so they decided to start over with a second, "next generation" version built on my Manya platform. I was the first developer to begin work on it, in part to prove out and debug Manya. The project grew to include seven developers. I served largely as a mentor as new developers were added and engineered several key pieces myself.
Goals
Quantifi's UIs seemed deceptively simple at first, a reflection of a basically good design that hides a lot of complexity. Most data would be entered in a main "document editor" web page. But one key requirement would allow our customers to customize the document designs, which meant creating new fields, changing their order on the page, setting default values, and so on. This meant the document page had to be constructed entirely dynamically, in real time, based on the customer's specifications. Before the project officially got started, and before there was a database to work with, I created this page's UI, mostly to exercise the Manya platform I was creating. I mentored the other developers who took over afterward.
One requirement was to store file attachments users uploaded with document records. Because our scale-up approach involved multiple web servers, it was impractical (and insecure) to store those files in publicly visible folders in the web app. I developed a reusable "file dump" service and component to move uploaded files into a dedicated database and easily pump them out to web clients on request.
One key feature of this new version of Quantifi would be its "list engine", which would show documented events selected using filters users could define themselves. The database designer for the project laid out the appropriate tables and started work on the list engine, while I implemented the list editor. When she fell ill for a few months, I took over construction of the list engine, which translates a given list definition into a set of stored procedures (technically, UDFs) in the database, mainly for performance reasons.
Although lists were an end in themselves, providing users with a natural workflow for day-to-day processing, they also served as input for Quantifi's new reporting engine. The database designer for the project and I collaborated on splitting up the work. She made a database component to generate the raw aggregate values, and I made C# code to construct an in-memory data structure, calculate item percentages, subtotals, and totals, and construct a generic XML representation of the final data. In addition, I created the report viewer, which featured both a cross-tab data grid and a line chart of the data. I created the charting component from scratch, too.
Sentri7, P1S' flagship product, provided "surveillance" of patient records in a hospital pharmacy. It was mainly used to identify opportunities for cost savings, safety improvements, and regulatory compliance administration. One of its key advantages over existing competitors was the ease with which customers could develop their own filtered lists and reports. I was asked to improve the performance and user-friendliness of the list designer tool.
Goals
Sentri7's competitors had good products, many of them more mature, but most suffered a significant limitation: it was cumbersome to create new filtered lists, or customers had to pay their vendors to do it for them. Sentri7 made it fairly easy to define filtering rules and custom reports based on them.
The original plan for the S7 list editor was to have a highly dynamic, DHTML-based "rules wizard" that would allow users to add one or more rules, selecting what to filter by and setting the specifics for each rule. Unfortunately, the first version was becoming an unwieldy mess of code and was not cross-browser compatible, so the team rushed to production with a pure post-back approach. That is, each selection in a drop-down list or the like would cause the web page to be submitted and redrawn. I was brought into the project to revive the DHTML version, and I rewrote it from scratch. The result was cleaner, much more compact, and, most importantly, cross-browser compatible. All dynamic, selection-driven population of data was done using the AJAX model: XML RPCs back to a web service.
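The pattern itself was simple: each selection fires an asynchronous request to a web service, and the response repopulates the dependent control without a post-back. A stripped-down sketch follows; the endpoint name and XML shape are hypothetical.

```typescript
// Stripped-down sketch of the AJAX pattern described above: a selection in
// one drop-down fires an XML RPC to a web service and the response
// repopulates the dependent drop-down without a post-back. The endpoint
// ("RuleFieldService.asmx/GetOperators") and XML shape are hypothetical.
function populateOperators(fieldSelect: HTMLSelectElement, operatorSelect: HTMLSelectElement): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", `RuleFieldService.asmx/GetOperators?field=${encodeURIComponent(fieldSelect.value)}`);

  xhr.onload = () => {
    if (xhr.status !== 200 || !xhr.responseXML) return;
    operatorSelect.options.length = 0; // clear the dependent list

    // Hypothetical response shape: <operators><operator id="eq">equals</operator>...</operators>
    for (const node of Array.from(xhr.responseXML.getElementsByTagName("operator"))) {
      const option = new Option(node.textContent ?? "", node.getAttribute("id") ?? "");
      operatorSelect.add(option);
    }
  };

  xhr.send();
}
```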
P1S had acquired a suite of applications developed at different times by different teams using significantly different technologies. Having determined the platform they wanted, they hired me to develop a framework for proper application development by their small but growing team of software engineers.
Goals
Pharmacy OneSource had already taken a step toward their goal platform -- ASP.NET / SQL Server -- by developing their newest product, Sentri7. But they had recognized that there were several key problems with it and that it was stretching the limits of the developers. One particular shortcoming at P1S was experience with DHTML / JavaScript / AJAX.
Initially, they hired me to fill the experience gap in client-side web development, but they realized that they needed something more. So they allowed me to develop Manya, a new UI framework, and Sara, a middle-tier framework. Together, Manya and Sara ("Manya" for short) represented a novel programming model that favored small amounts of linear, transparent code and decoupled the UI from the middle tier, but without a lot of the overhead that often comes with N-tier frameworks.
One key distinction is the Manya control set. Unlike standard ASP.NET controls, Manya's controls all had client-side components, which made for very easy JavaScript coding when that was required. Many also had built-in remote procedure call (RPC) capabilities for fetching data, performing validation, and more via AJAX calls.
Manya also did away with the complex post-back model of traditional ASP.NET in favor of a model-view-controller (MVC) pattern, with all requests being channeled to explicit "operations" (e.g., "OpSave") or RPC handlers (e.g., "RpcCheckEmailAddress").
One other key distinction was the use of a "common information currency" that automatically coupled input from web forms and output to controls in a mutable collection that could be easily passed into and out of the middle tier and used in all parts of the system. The mechanism made it possible to encode 90% of an application's validation rules in simple "domain objects" -- classes that have nothing but properties adorned with validation rule attributes like is-required or maximum value -- and let the data collections absorb those validation rules for use by the middle tier (e.g., during saves) and even by the UI, to do things like automatically setting maximum lengths for strings.
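Conceptually, a domain object carried nothing but properties and declarative rules, and the shared data collection absorbed those rules for both tiers to use. The following is a rough TypeScript analogue of the idea; Manya itself was C# with .NET attributes, and these names are invented for illustration.

```typescript
// Rough analogue of Manya's "common information currency": a domain object
// declares nothing but fields and validation rules, and a mutable data bag
// absorbs those rules so both the UI and the middle tier can apply them.
// Manya itself was C# with .NET attributes; these names are invented.
interface FieldRule { required?: boolean; maxLength?: number; maxValue?: number }

const patientRules: Record<string, FieldRule> = {
  lastName: { required: true, maxLength: 50 },
  email: { maxLength: 100 },
  dailyDoseMg: { maxValue: 5000 },
};

class DataBag {
  private values = new Map<string, unknown>();
  constructor(private rules: Record<string, FieldRule>) {}

  set(field: string, value: unknown) { this.values.set(field, value); }
  get(field: string) { return this.values.get(field); }

  // The UI can read the same rules to, e.g., set an input's maxlength.
  maxLength(field: string) { return this.rules[field]?.maxLength; }

  // The middle tier applies the rules at save time.
  validate(): string[] {
    const errors: string[] = [];
    for (const [field, rule] of Object.entries(this.rules)) {
      const value = this.values.get(field);
      if (rule.required && (value === undefined || value === "")) errors.push(`${field} is required`);
      if (rule.maxLength && typeof value === "string" && value.length > rule.maxLength)
        errors.push(`${field} exceeds ${rule.maxLength} characters`);
      if (rule.maxValue && typeof value === "number" && value > rule.maxValue)
        errors.push(`${field} exceeds ${rule.maxValue}`);
    }
    return errors;
  }
}
```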
One key goal for Manya was to implement the design patterns and styles pioneered by the company's main business analyst / UI designer. Coders didn't need to put much thought into these requirements because they simply added controls to screens in the order indicated by the designs and the style was inherently there. That is, the constraints Manya placed on the developer made the company's design patterns and standards an implicit, automatic thing instead of requiring a detailed, explicit effort on the developer's part.
Manya introduced a strong set of disciplines and tools for developing clear and concise code. One of its best strengths, in my opinion, was its openness to alternatives. My mantra in training was that one could bypass just about anything in Manya if it got in the way, but one rarely needed to.
I've been participating in online chats, writing articles, and posting code I've developed to public web sites for years. The advice and training I give is to people of all levels of experience, from beginners to the most expert. The code I post comes from code libraries and complete applications I've developed primarily in my spare time.
Goals
My goal is to educate other developers, to enrich the IT field with new techniques, to keep my own skills honed, and even to further build my own professional reputation.
I was the sole software developer.
I am responsible for all the code and applications I provide and the advice I dispense. I do, however, owe a great deal of credit to the many other people who have done the same for me.
Among other applications and reusable components I've developed for free, public use are:
Non-volatile RAM engine
TCP/IP socket controls for client/server applications
VB demonstrations of how to create menu-like popup windows
Binary file viewer
Library to manage processes in Windows NT
Library of array manipulation routines, including splicing, conversion, and more
Library of string manipulation routines for parsing, translation, and validation
Postal address entry, parsing, and validation control
HTML editor with an automatically-updating preview
Summarizer engine that transforms a body of text into a summary of approximately any desired number of words
TCP/IP server simulator to aid client developers
Thumb wheel control for scrolling
A VB code parser, auto-documenter, and viewer
XML stream processor
Huffman compression algorithm
Among other articles I've published are:
Ad Hoc Data Structures (creating data structures "on the fly" and without formal definitions)
Introduction to [VB] Error Handling
Quick 'N' Dirty OOP Inheritance (VB doesn't natively support it)
An introduction to TCP/IP sockets and client/server programming
Using DoEvents to Make Your [VB] Apps CPU-Friendly
Basic Web Technologies
Whence the Process? (introduction to the three basic places processing can occur in client/server systems)