What kind of applications aren't easily possible without a central message bus / knowledge store? Or, rephrased: what kind of applications would benefit greatly from the use of a central message bus / knowledge store?
To answer this, let's look at how IPC usually works between applications: if a program wants to communicate with other parts of the system, there are several options, including writing data to a regular file, pipe/fifo, or socket, as well as directly executing another program. In some cases the communication style is uni-directional, in others bi-directional.
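To make the status quo concrete, here is one of those point-to-point channels as a toy Python sketch on a Unix system; `tr` just stands in for an arbitrary consumer program. The point is that the channel is private: nothing outside this parent/child pair can see the data.

```python
import subprocess

# Classic point-to-point IPC: spawn a consumer and write to its stdin
# over a pipe. No other component can observe or intercept the exchange.
consumer = subprocess.Popen(
    ["tr", "a-z", "A-Z"],          # stand-in for any consumer program
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)
out, _ = consumer.communicate("hello over a pipe\n")
print(out, end="")  # HELLO OVER A PIPE
```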
So depending on the setup you can define pretty much exactly how messages are routed between components. However, the details of this communication are hidden from the outside: if you wanted to react to a message passing between two other parts of the system, you're out of luck. This direct coupling between components doesn't lend itself very well to interception and rerouting.
Unless the program of your choice is very scriptable, you then have no good way to e.g. run a different editor for a file that is about to be opened. Since programs typically don't advertise their internal capabilities for outside use (in the way CocoaScript (?) allows to a degree), you also have no chance to react to a program's internal events.
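For contrast, here is a minimal in-process sketch of the bus idea. The topic name `open-file` and the message shape are invented for illustration, and a real system would run over a socket or similar transport rather than inside one process; but it shows how a shared bus makes requests observable and reroutable, e.g. substituting the user's preferred editor:

```python
from collections import defaultdict

class Bus:
    """Toy publish/subscribe bus; a real one would be a system service."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self.handlers[topic]:
            handler(message)

bus = Bus()

# Any component can observe requests without the sender knowing about it.
bus.subscribe("open-file", lambda msg: print("observed:", msg))
# The user's preferred editor claims the request instead of a hard-wired one.
bus.subscribe("open-file", lambda msg: print("opening", msg["path"], "in vim"))

# Programs publish a request instead of exec'ing a fixed editor binary.
bus.publish("open-file", {"path": "/tmp/notes.txt"})
```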
Proposed changes to browsers would include decoupling bookmark handling, cookies/session state, notifications, and password management. Additionally it would be useful to expose the whole scripting interface to allow external control of tabs and windows, as well as possibly hooking into website updates, but I think that part is just a side-effect of doing the rest.
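Purely as an illustration of what that decoupling might look like on the wire (every topic name and field below is made up), each of these browser services could shrink to a handful of bus messages:

```python
import json

# Hypothetical messages a decoupled browser might exchange over the bus;
# all topic names and payload fields are assumptions for illustration.
messages = [
    {"topic": "browser.tab.opened", "url": "https://example.org", "window": 1},
    {"topic": "bookmarks.add", "url": "https://example.org", "title": "Example"},
    {"topic": "passwords.lookup", "realm": "example.org", "reply_to": "client-42"},
    {"topic": "notify", "source": "browser", "text": "Download finished"},
]
for message in messages:
    print(json.dumps(message))
```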
Proposed changes to IRC clients / instant messengers would include decoupling password management and notifications. Additionally, the same argument for exposing channels/contacts/servers to external applications applies.
Now let's take a look at the knowledge store. In the past I've used a similar blackboard system to store sensor data and aggregate knowledge from reasoners. The idea behind it is to decouple the different parts of a program from the data they work on: each part reacts to incoming data when necessary and writes its results back for other programs to work on.
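A minimal sketch of that blackboard pattern, with made-up fact shapes: a sensor writes raw readings, a reasoner subscribes to them and writes derived knowledge back, and anyone can query the results. Note the naive trigger evaluation, where every write scans all subscriptions:

```python
class Blackboard:
    """Toy blackboard: facts are dicts, subscriptions are predicates."""
    def __init__(self):
        self.facts = []
        self.subscriptions = []

    def put(self, fact):
        self.facts.append(fact)
        # Naive trigger evaluation: every write scans all subscriptions.
        for predicate, handler in self.subscriptions:
            if predicate(fact):
                handler(fact)

    def query(self, predicate):
        return [f for f in self.facts if predicate(f)]

    def subscribe(self, predicate, handler):
        self.subscriptions.append((predicate, handler))

bb = Blackboard()

# A reasoner turns raw sensor readings into derived knowledge.
def on_temperature(fact):
    if fact["value"] > 30:
        bb.put({"kind": "alert", "text": "it is hot"})

bb.subscribe(lambda f: f.get("kind") == "temperature", on_temperature)
bb.put({"kind": "temperature", "value": 35})         # sensor writes raw data
print(bb.query(lambda f: f.get("kind") == "alert"))  # others read the result
```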
I imagine that this kind of system relieves programs of inventing their own formats for storing data, as well as of explicitly specifying where to get data from. Compared to an RDBMS the downside is obviously the lack of a hard schema, so the same problems known from document-based data stores apply here. Additionally, the requirement to run triggers in order to satisfy client subscriptions (the naive per-write scan in the sketch above) makes the overall model more complex and harder to optimise.
What is then possible with such a system? Imagine having a single command to switch to a specific buffer, regardless of how many programs are open and whether they use an MDI or just a single window. In general, scripting all running programs becomes easier.
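A hedged sketch of that buffer-switching command; the topics `buffers.list` and `buffers.focus` as well as the bus client API are invented here, and the stub merely simulates two programs answering:

```python
class StubBus:
    """Stand-in for a real bus client; query/publish are assumed APIs."""
    def query(self, topic):
        # Pretend two running programs answered with their open buffers.
        return [{"program": "editor", "buffer": "notes.txt"},
                {"program": "terminal", "buffer": "scratch"}]

    def publish(self, topic, message):
        print("focus request:", message)

def switch_to(bus, name):
    """One command that works across programs, MDI or single-window."""
    for entry in bus.query("buffers.list"):
        if entry["buffer"] == name:
            bus.publish("buffers.focus", entry)
            return True
    return False

switch_to(StubBus(), "notes.txt")
# -> focus request: {'program': 'editor', 'buffer': 'notes.txt'}
```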
The knowledge store, on the other hand, could be used to hold small amounts of data like contacts, the subjects of the newest emails, and notifications from people and websites. All of that context data is then available for other programs to use.
Assuming such data was readily available, using ML to process at least some of the incoming stream to look for important bits of information (emails/messages from friends and colleagues, news stories) could then build an additional database of "current events". How this is displayed is again a different problem. The simplest approach would be a ticker listening on a specific query; the most complex might be a whole graphical dashboard.
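The ticker end of that pipeline could be as small as a subscription printing whatever matches its query. In this sketch the event stream is a plain list, and the `important` flag is assumed to have been set by the ML stage described above:

```python
import time

def ticker(events):
    """Toy ticker: print events matching the 'important' query."""
    for event in events:
        if event.get("important"):
            print(time.strftime("%H:%M"), "-", event["text"])

ticker([
    {"kind": "email", "important": True, "text": "Reply from a colleague"},
    {"kind": "news", "important": False, "text": "Minor site update"},
    {"kind": "news", "important": True, "text": "Big story breaking"},
])
```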
Security is obviously a problem in such a share-all approach. It should be possible, though, to restrict access to data similarly to how user accounts work in regular DBMSs. For scripting interactions the system would still have to enforce restrictions based on the originating user and group on a single system, as well as on the host in a distributed environment.
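As a rough sketch of what such DB-style grants could look like when applied to bus topics — the rule format with user, group, host, and topic prefix is my assumption:

```python
# Each rule grants access to a topic prefix for a matching user/group/host.
ACL = [
    {"user": "alice", "host": "localhost", "topic": "passwords."},
    {"group": "staff", "host": "*", "topic": "notify"},
]

def allowed(user, groups, host, topic):
    """Grant access if any rule matches the caller and the topic prefix."""
    for rule in ACL:
        if "user" in rule and rule["user"] != user:
            continue
        if "group" in rule and rule["group"] not in groups:
            continue
        if rule["host"] not in ("*", host):
            continue
        if topic.startswith(rule["topic"]):
            return True
    return False

print(allowed("alice", ["staff"], "localhost", "passwords.lookup"))  # True
print(allowed("bob", ["users"], "remote", "passwords.lookup"))       # False
```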