It's possible to turn the safeties off because they are software, in just the same way one can turn off one's antivirus program while surfing the net, or set one's monitor to display only 16 colours.
But if it's a program that has the potential to hurt people, it's insanely irresponsible safety design to include a setting that deliberately makes it lethal. For instance, why is it even possible for holodeck guns to fire solid bullets? Bullets in flight are invisible to the eye, so there's no reason to include them in the program at all. Just simulate the muzzle flash and the impact, the way filmmakers do with blanks and squibs. Deliberately designing a holodeck program that's even capable of simulating a lethal firearm just doesn't make sense.
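To make that concrete, here's a toy sketch in Python (every name in it is invented for illustration, not anything canonical) of what "don't build the capability in" looks like in software. The simulation's vocabulary of gun effects simply has no solid-bullet entry, so lethality isn't a mode that a user, a glitch, or a malfunction could ever select:

```python
from enum import Enum, auto

class GunEffect(Enum):
    MUZZLE_FLASH = auto()   # light and smoke at the barrel
    REPORT = auto()         # the bang
    IMPACT_SQUIB = auto()   # cosmetic hit effect on the target
    # Note what is NOT here: there is no SOLID_BULLET member.
    # Lethality isn't a setting you toggle off; it's a state the
    # system cannot express in the first place.

def play(effect, target):
    # Stand-in for the real renderer.
    print(f"rendering {effect.name} at {target}")

def render_shot(target):
    """Play the full audiovisual illusion of a gunshot."""
    for effect in (GunEffect.MUZZLE_FLASH, GunEffect.REPORT,
                   GunEffect.IMPACT_SQUID if False else GunEffect.IMPACT_SQUIB):
        play(effect, target)

render_shot("gangster extra")
```

The design point is that safety here doesn't depend on a check that could be disabled; the dangerous state just isn't representable.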
You know what Murphy's Law really means? "Anything that can go wrong, will"? It's actually a statement of a basic principle of safety engineering: if your system is capable of going wrong in a particular way, then that is the way it will go wrong when a failure inevitably occurs. So good design means making sure your system doesn't have dangerous failure modes built in -- or at least that it fails safe, i.e. that when something breaks, it defaults to a mode that won't hurt anyone rather than one that will. If you design a holodeck program that's capable of simulating a gun that fires solid, deadly bullets at living beings, then you've completely failed to apply common-sense safety principles, and you'd be held liable for gross negligence the moment someone died as a result of your hideously inept design.
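Here's an equally toy sketch of fail-safe behaviour, again with invented names throughout. The structural idea is that the dangerous state (solid matter) requires every component to keep working correctly, while any fault at all drops the scene back to harmless light:

```python
class HoloObject:
    def __init__(self, name):
        self.name = name
        self.solid = True   # force fields engaged: can exert real force

class Scene:
    def __init__(self, objects):
        self.objects = objects

    def advance(self):
        # Stand-in for the physics/rendering tick; imagine any fault here.
        raise RuntimeError("power surge in holo-emitter grid")

def run_step(scene):
    try:
        scene.advance()
    except Exception:
        # Fail safe: on ANY fault, revert everything to intangible
        # light before propagating the error for diagnosis.
        for obj in scene.objects:
            obj.solid = False
        raise

scene = Scene([HoloObject("tommy gun"), HoloObject("gangster")])
try:
    run_step(scene)
except RuntimeError as err:
    print(f"fault: {err}")
    print([(obj.name, obj.solid) for obj in scene.objects])
```

A fictional holodeck that followed this principle would respond to a malfunction by turning everything into harmless photons, not by locking the doors and making the bullets real.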
But fiction depends on things going wrong, so the demands of the storyteller are opposed to the demands of the safety engineer. Thus, systems in fiction tend to be deliberately designed to become more dangerous, not less, in the event of malfunction. I think perhaps we get so used to seeing that happen in fiction that we fail to realize how unrealistic and illogical it is.