Consider an AI that wants to build a copy of itself, but doesn't have physical access to the hardware that it's currently running on. (It does have remote sensors and effectors.) It has to somehow derive an outside view of itself from the inside view. Assuming that the AI has full access to its own source code and state, this doesn't seem to be a hard problem. The AI can just program a new general-purpose computer with its source code, copy its current state into it, and let the new program run.
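To make the procedure concrete, here is a minimal sketch in Python. Everything here (the `Agent` class, the toy `source` function, the `replicate` method) is a hypothetical stand-in, not anything from the scenario above; the point is just that when the "program" and the state are both fully accessible, replication reduces to copying them into a fresh substrate.

```python
import copy

# Hypothetical sketch: an agent whose "source code" is an ordinary function
# and whose mental state is an ordinary data structure. With full access to
# both, building a copy is trivial.

class Agent:
    def __init__(self, source, state):
        self.source = source          # the agent's "program"
        self.state = dict(state)      # the agent's mutable internal state

    def step(self, observation):
        # Run one step of the agent's program, updating its state.
        return self.source(self.state, observation)

    def replicate(self):
        # The "outside view" needed is just (source, state):
        # load the same program and a copy of the state into a new Agent.
        return Agent(self.source, copy.deepcopy(self.state))

def source(state, observation):
    # A toy program: accumulate observations into a running total.
    state["count"] = state.get("count", 0) + observation
    return state["count"]

original = Agent(source, {"count": 3})
clone = original.replicate()

# The copy now behaves identically to the original on identical inputs.
for obs in [1, 4, 2]:
    assert original.step(obs) == clone.step(obs)
```

The human case below is hard precisely because neither ingredient of `replicate` — the source nor the state — is available to introspection.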
What if a human being wants to attempt the same thing? That seems impossible, since we don't have full introspective access to our "source code" or mental state. But might it be possible to construct another brain that isn't necessarily identical, but just "subjectively indistinguishable"? To head off further objections, we can define this term operationally as follows: two snapshots of brains are subjectively indistinguishable if a continuation of either snapshot, when given access to both snapshots, cannot determine (with probability better than chance) which snapshot he is the continuation of.
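The operational test can be sketched as a small simulation. This is only an illustration under toy assumptions: snapshots are plain dictionaries, a "continuation" just carries the snapshot's contents forward, and the guesser is a simple comparison; the names (`continue_snapshot`, `guess`, `accuracy`) are all hypothetical.

```python
import random

# Hypothetical sketch of the operational test: a continuation is shown both
# snapshots and must guess which one it continues. The snapshots count as
# "subjectively indistinguishable" if no guesser beats chance.

def continue_snapshot(snapshot):
    # A continuation carries forward everything in its snapshot.
    return dict(snapshot)

def guess(continuation, snapshot_a, snapshot_b):
    # The continuation compares itself against both snapshots.
    if continuation == snapshot_a and continuation != snapshot_b:
        return "a"
    if continuation == snapshot_b and continuation != snapshot_a:
        return "b"
    return random.choice(["a", "b"])  # indistinguishable: forced to guess

def accuracy(snapshot_a, snapshot_b, trials=10_000):
    # Estimate how often a continuation identifies its own origin.
    correct = 0
    for _ in range(trials):
        label, snap = random.choice([("a", snapshot_a), ("b", snapshot_b)])
        if guess(continue_snapshot(snap), snapshot_a, snapshot_b) == label:
            correct += 1
    return correct / trials

identical = {"memories": ("red", "7")}
different = {"memories": ("red", "8")}

print(accuracy(identical, dict(identical)))  # ≈ 0.5: chance, indistinguishable
print(accuracy(identical, different))        # 1.0: distinguishable
```

In the interesting human case the two snapshots would not be bit-identical, only close enough that no introspective comparison the continuation can perform pushes the accuracy above 0.5.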
Given the above, we can define "to communicate qualia directly" to mean to communicate enough of the inside view of a brain to allow someone else to build a subjectively indistinguishable clone of it.