I recall, in an early role in Financial Services, finding myself exasperated by the constant re-emergence of a particular technology (and supplier!). The phenomenon has repeated itself several times since, across much of my cost reduction and quality programmes.
The truth is that applications users love will constantly be put to new purposes, and the battle to remove risky, homemade or non-core technology that sits outside our control is a perpetual one. But how many of us can afford to review information flows constantly, so that we can uncover the odd spreadsheet or home-built database that has become critical to a process? Not many, I would surmise.
There are a couple of obvious practical high-level solutions. The first is a rolling programme of monitoring (possible but expensive) or the occasional audit (common and well understood). The second is a bit more radical: find a technology that the users love and put in place a mechanism by which it is better controlled and understood, with training on the risks and the requisite declarations given regularly and professionally.
The starting point, though, is an inventory and a risk assessment against each item. Can the databases and spreadsheets be replicated in a core application? Can they be locked down? Are they necessary at all? Follow that up with rigorous training.
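To make the inventory-and-assessment step concrete, here is a minimal sketch of how the three questions above could be recorded and used to rank remediation priority. All names, fields and weightings here are hypothetical illustrations, not a prescribed methodology:

```python
# Hypothetical sketch: record each end-user-computing (EUC) asset with
# answers to the three questions, and rank by a crude illustrative score.
from dataclasses import dataclass

@dataclass
class EucAsset:
    name: str                 # e.g. "exposure spreadsheet" (illustrative)
    owner: str                # responsible user or team
    replicable_in_core: bool  # could a core application do this instead?
    can_be_locked_down: bool  # is access/change control possible?
    still_necessary: bool     # is the asset needed at all?

    def risk_score(self) -> int:
        """Crude illustrative score: higher means act sooner."""
        score = 0
        if not self.still_necessary:
            score += 3  # unnecessary assets are the easiest wins
        if self.replicable_in_core:
            score += 2  # a core-system home already exists
        if not self.can_be_locked_down:
            score += 2  # uncontrollable assets carry the most risk
        return score

def prioritise(assets: list[EucAsset]) -> list[EucAsset]:
    """Return assets ordered by descending risk score."""
    return sorted(assets, key=lambda a: a.risk_score(), reverse=True)
```

In practice the scoring would reflect your own risk appetite and regulatory context; the point is simply that a structured inventory, however basic, turns the three questions into something that can be tracked and acted on.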
We may not be able to stamp out the problem (ever?), but we can certainly stop users deliberately putting down more muck to spread the menace!