We make many compromises in IT.
Don't make things worse. Don't break more things to accommodate one broken thing.
On one project in my past, we had an elderly Windows application that used some data files. The application was designed so that the data files had to reside in the same directory as the executable. That departs from the typical design for a Windows application, which stores data files in a user-writeable location. Writing to C:\Program Files should be done only by install programs, and only with elevated privileges.
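The post does not show the application's code or even its language, so the following is only a minimal C++ sketch of the two designs being contrasted: resolving a data directory from the executable's own location (the old design) versus resolving it from a machine-wide, user-writeable folder such as ProgramData (the conventional design). The function names and the appName parameter are illustrative, not taken from the actual application.

```cpp
// Hypothetical sketch only -- illustrates the two data-directory designs
// discussed above, not the actual application's code.
// Link with Shell32.lib and Ole32.lib.
#include <windows.h>
#include <shlobj.h>      // SHGetKnownFolderPath, FOLDERID_ProgramData
#include <filesystem>
#include <string>

namespace fs = std::filesystem;

// The old design: data files live next to the executable,
// which normally means somewhere under C:\Program Files.
fs::path dataDirFromExe()
{
    wchar_t exePath[MAX_PATH];
    GetModuleFileNameW(nullptr, exePath, MAX_PATH);
    return fs::path(exePath).parent_path();
}

// The conventional design: data files live in a machine-wide,
// writeable location such as C:\ProgramData\<AppName>.
fs::path dataDirStandard(const std::wstring& appName)
{
    PWSTR raw = nullptr;
    fs::path dir;
    if (SUCCEEDED(SHGetKnownFolderPath(FOLDERID_ProgramData, 0, nullptr, &raw)))
    {
        dir = fs::path(raw) / appName;
        fs::create_directories(dir);  // harmless if it already exists
    }
    CoTaskMemFree(raw);  // safe to call with nullptr
    return dir;
}
```

Switching a program from the first approach to something like the second is, roughly, the kind of change the third option below implies.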
Fortunately, the data files were read by the application but not written, so we did not have to grant the application write access to a location under C:\Program Files. The program could run, with its unusual configuration, and no harm was done.
But things change, and there came a time when this application had to share data with another application, and that application *did* write to its data files.
Our choices were to violate generally accepted security configuration or to modify the offending application. We could grant the second application write permission to C:\Program Files (actually to a subdirectory, but still a variance from good security).
Or, we could install the original application in a different location, one in which the second application could write to files. This, too, is a variance from good security. Executables are locked away in C:\Program Files for a reason -- they are the targets of malware, and Windows guards that directory. (True, Windows does look for malware in all directories, but it's better for executables to be locked away until needed.)
Our third option was to modify the original application. This we could do; we had the source code and we had built it in the past. The code was not in the best of shape, and small changes could break things, but we did have experience with changes and a battery of tests to back us up.
In the end, we selected the third option. This was the best option, for a number of reasons.
First, it moved the original application closer to the standard model for a Windows application. (There were other things we did not fix, so the application still wasn't perfect.)
Second, it allowed us to follow accepted procedures for our Windows systems.
Finally, it prevented the spread of bad practices. Compromising security to accommodate a poorly-written application is a dangerous path. It expands the "mess" of one application into the configuration of the operating system. Better to contain the mess and not let it grow.
We were lucky. We had an option to fix the problem application and maintain security. We had the source code for the application and knowledge about the program. Sometimes the situation is not so nice, and a compromise is necessary.
But whenever possible, don't make things worse.