
[Solved]: Where can I find source to source transformation optimization rules?

Problem Detail: 

Upon reading "Do source code optimizers exist?" I knew that such programs existed, but the ones I have worked with use a set of rules to drive a transformation algorithm. Ira Baxter's answer provided a link to the tools that run the algorithms, but what I was more after from the question was the rules themselves. In the comments Ira noted the Irvine Program Catalog.

Are there any more papers, sites, etc. that list such rules?

I will accept any answer but have a preference for rules used with functional languages.

Asked By : Guy Coder

Answered By : Ira Baxter

Martin Feather's thesis and papers (and similar theses from the 1970s) are full of examples of such rules for functional languages; see http://dl.acm.org/citation.cfm?doid=357153.357154 (PDF). Most of the formal-methods folks like such (functional) rules.

More recent would be any work by Martin Ward using his functional language WSL, including his thesis; see http://www.cse.dmu.ac.uk/~mward/martin/papers.

If I were you, I'd look to the Haskell community, which is always trying to optimize functional programs, and particularly to the Haskell compilers which implement many of these ideas.
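For a concrete taste of what such rules look like in practice, GHC lets you state rewrite rules directly in source code via its RULES pragma; the classic map-fusion law is a one-liner. Below is a minimal sketch (the module and function names are my own, and the local copy of map is only there to keep the example from colliding with the fusion rules GHC already ships for the Prelude's map; the rule fires when compiling with -O):

```haskell
module MapFusion where

import Prelude hiding (map)

-- A local copy of map, so the rule below does not interact with the
-- fusion rules GHC already provides for the Prelude's map.
map :: (a -> b) -> [a] -> [b]
map _ []       = []
map f (x : xs) = f x : map f xs
{-# NOINLINE map #-}

-- The classic map-fusion law, stated as a source-to-source rewrite
-- rule that GHC applies during optimization.
{-# RULES
"map/map"  forall f g xs.  map f (map g xs) = map (f . g) xs
  #-}

-- Two list traversals that the rule fuses into one.
doubleThenShow :: [Int] -> [String]
doubleThenShow xs = map show (map (* 2) xs)
```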

But I get the impression you're looking for "the book" of program optimization rules. There are some pretty standard ones based on basic algebra as applied to integers, etc., and you can see a bunch of these coded explicitly for DMS in an example of how to define algebra to a program transformation system: Algebra as a DMS domain.
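For illustration, here is a hypothetical sketch, written in Haskell rather than DMS's rule notation, of what a few of those basic algebraic rules look like when coded against an explicit expression AST (all types and names below are mine, not DMS's):

```haskell
module Algebra where

-- A tiny integer-expression language.
data Expr
  = Lit Integer
  | Var String
  | Add Expr Expr
  | Mul Expr Expr
  deriving (Eq, Show)

-- A handful of standard algebraic identities, each written as a
-- local rewrite: match a shape, produce a cheaper equivalent.
simplify1 :: Expr -> Expr
simplify1 (Add (Lit 0) e)       = e            -- 0 + x  =  x
simplify1 (Add e (Lit 0))       = e            -- x + 0  =  x
simplify1 (Mul (Lit 1) e)       = e            -- 1 * x  =  x
simplify1 (Mul e (Lit 1))       = e            -- x * 1  =  x
simplify1 (Mul (Lit 0) _)       = Lit 0        -- 0 * x  =  0
simplify1 (Mul _ (Lit 0))       = Lit 0        -- x * 0  =  0
simplify1 (Add (Lit a) (Lit b)) = Lit (a + b)  -- constant folding
simplify1 (Mul (Lit a) (Lit b)) = Lit (a * b)
simplify1 e                     = e

-- For example:  simplify1 (Mul (Lit 1) (Var "x"))  ==>  Var "x"
```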

You aren't likely to find "the general book" for the following reasons:

  • Every syntax formalism in which rules are represented makes a choice about how to write down the details of the rules. Nobody has agreed on a universal notation; that's not a flaw of software engineering, because the mathematicians haven't figured this out yet either. So any rules you find will be cast using a particular encoding scheme. Unless you get the tool that uses that encoding scheme, you probably can't use the rules. (See the algebra example above for a specific encoding scheme.)
  • "Optimizations" are essentially about equivalent sequences of operations where one sequence is "cheaper" than another by some metric (smaller number of steps, less computation, less memory, less heat, less code, ...) The number of entities for which you can define operations is boundless: booleans, naturals, floats, strings, structs, crossproducts, linkedlists, arrays, arbitary compositions of the above, each with predicates possibly constraining them to a subset (eg., all naturals less than 10). The number of operations you define is equally boundless. Which ones are worth writing down?
  • A variation on the above theme is that the syntax and semantics of the programming languages you want to manipulate are egregiously different (did we really need both Java and C#?), or deeply semantically different (COBOL and Prolog, for example), or just outright egregiously complex (consider C++11). So your rules need to be cast in a form that is dependent on your target language's syntax and semantics. And there are thousands of those.
  • Source-to-source transformations seem to offer nirvana, but they have limitations. In particular, with most ways you can represent a transformation to be matched against code, what it can match has a limited radius. If your optimization needs information from "far away", i.e., beyond the radius of what you can write in the rules, no single rule can realize your optimization. You can only succeed by composing more than one rule (and in theory you can succeed spectacularly, because Post systems are Turing capable, and most rewrite systems are almost trivially Post systems); a small sketch of this kind of rule composition follows this list. However, writing long strings of transformations to achieve an effect is much like writing long sequences of Turing machine steps: you really, really don't want to do that. So in practice, transformation systems usually offer the ability to compose arbitrary code procedures and source transformations into "bigger" source-to-source transformations. So now your real transformations are mixed code and rewrites, using code from one of those thousands of programming languages. (Oops.)
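To make the composition point concrete, here is a minimal, hypothetical sketch (all names are mine) of a fixed-point rewrite engine over the same kind of expression AST as in the algebra sketch above: each rule is purely local, yet iterating their composition achieves a simplification that no single rule matches on its own.

```haskell
module Rewrite where

-- The same tiny expression language as in the algebra sketch.
data Expr
  = Lit Integer
  | Var String
  | Add Expr Expr
  | Mul Expr Expr
  deriving (Eq, Show)

-- A "rule" either rewrites a term or gives it back unchanged.
type Rule = Expr -> Expr

-- Apply one pass of a rule everywhere in the tree, bottom-up.
bottomUp :: Rule -> Expr -> Expr
bottomUp r (Add a b) = r (Add (bottomUp r a) (bottomUp r b))
bottomUp r (Mul a b) = r (Mul (bottomUp r a) (bottomUp r b))
bottomUp r e         = r e

-- Keep rewriting until nothing changes: composing many small,
-- local rules yields an effect no single rule could match.
fixpoint :: Rule -> Expr -> Expr
fixpoint r e
  | e' == e   = e
  | otherwise = fixpoint r e'
  where e' = bottomUp r e

-- Two local rules, composed by simple function chaining.
unitLaws :: Rule
unitLaws (Mul (Lit 1) e) = e
unitLaws (Add (Lit 0) e) = e
unitLaws e               = e

constFold :: Rule
constFold (Add (Lit a) (Lit b)) = Lit (a + b)
constFold (Mul (Lit a) (Lit b)) = Lit (a * b)
constFold e                     = e

-- Neither rule alone reduces this to a literal, but the
-- fixed-point composition does.
example :: Expr
example = Add (Mul (Lit 1) (Lit 2)) (Add (Lit 0) (Lit 3))
-- fixpoint (constFold . unitLaws) example  ==>  Lit 5
```

Real transformation systems provide much richer machinery than this toy fixed-point loop, but the principle is the same: the interesting optimizations come from orchestrating many small rules, not from any single rule.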

My belief (uh-oh, opinion coming up) is that source-to-source transformation tools are really useful, because you can build program manipulation engines with them more easily (by a lot!) than with any procedural code or any other techniques I know about. The differences in syntax and semantics of the target languages mean you are pretty much doomed to not find such transformations off the shelf (well, you can pray for C++ to become stable for 20 years, and then maybe you have a chance). So, what is needed is a system which will let you define the rules you need, for the language you want to manipulate, relatively easily, when you need them. (And I've spent the last 18 years of my life building such a system.)

Good luck finding the universal ruleset.

Best Answer from StackExchange

Question Source : http://cs.stackexchange.com/questions/18583

