Today, powerful suites of natural language processing tools can be used to construct rich, spaghetti-like networks capturing the knowledge and semantics of text. Somewhere among these hotchpotch instantiations of various linguistic theories lies a wealth of elaborate interconnected structures that capture something fundamental about natural language. Everything is connected, but which connections matter, and for what? In my research, I explore how to automatically extract what is useful from such representations for solving structure generation problems, while seeking solutions that are agnostic to any particular linguistic theory or problem domain. I am particularly interested in the problem of generating globally semantically coherent text, as this seems to lie beyond the reach of the current state of the art, and it is where I believe rich representations are crucial.