Hacker News

Are you and 'userbinator saying the same thing? I can't tell. I know how the simple symbolic stack->expression evaluation works, and it happens that in my code I generate something pretty close to SSA expressions, but does SSA do something else profound for decompilation?


SSA abstracts the stack away and makes it much easier to reason about types.


I'm not sure I'm following. To get from stack operations to expressions, I just symbolically evaluate the stack, creating temporary variables as I go. It happens that the resulting IR is pretty much SSA form. But I'm not taking much else from SSA. I'm wondering if I'm missing opportunities.
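To make the "symbolically evaluate the stack" step concrete, here is a minimal sketch of the idea: each computed value pushed onto the stack gets a fresh temporary, so the output naturally falls into something close to SSA form. The tiny push/add/mul instruction set is hypothetical, just for illustration.

```python
def symbolic_eval(code):
    """Symbolically evaluate stack bytecode into SSA-like temporaries."""
    stack, temps, counter = [], [], 0
    for op, *args in code:
        if op == "push":
            stack.append(str(args[0]))
        elif op in ("add", "mul"):
            # Pop the two operands and bind the result to a fresh temp,
            # so every value is defined exactly once (SSA-like).
            b, a = stack.pop(), stack.pop()
            name = f"t{counter}"; counter += 1
            sym = "+" if op == "add" else "*"
            temps.append(f"{name} = {a} {sym} {b}")
            stack.append(name)
    return temps, stack

temps, stack = symbolic_eval([
    ("push", 2), ("push", 3), ("add",), ("push", 4), ("mul",),
])
# temps: ['t0 = 2 + 3', 't1 = t0 * 4'], stack: ['t1']
```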


It's easier to transform your expressions into a useful form from a guaranteed, proper SSA than from a simple tree representation. For example, induction variable extraction is trivial in SSA, and you really need to do it if you want to reconstruct nice-looking `for` loops.

It also pays off to have distinct basic blocks: loop analysis is much easier that way.
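A sketch of why induction variable extraction is easy in SSA: a basic induction variable shows up as a phi at the loop header whose one operand comes from outside the loop (the init) and whose other operand is defined inside the loop as phi-variable plus a constant (the step). The mini-IR here (statements as `(dest, op, args)` tuples) is an assumption for illustration, not any particular decompiler's format.

```python
def find_induction_vars(header_phis, body_stmts):
    """header_phis: var -> (init_value, loop_incoming_var).
    body_stmts: list of (dest, op, args) SSA statements in the loop body.
    Returns var -> {'init': ..., 'step': ...} for basic induction vars."""
    defs = {d: (op, args) for d, op, args in body_stmts}
    ivs = {}
    for var, (init, loop_in) in header_phis.items():
        op_args = defs.get(loop_in)
        # Pattern: loop_in = var + constant  =>  var is an induction variable.
        if op_args and op_args[0] == "add" and op_args[1][0] == var:
            ivs[var] = {"init": init, "step": op_args[1][1]}
    return ivs

# for (i = 0; ...; i += 1) in SSA:  i1 = phi(0, i2); i2 = i1 + 1
ivs = find_induction_vars(
    header_phis={"i1": ("0", "i2")},
    body_stmts=[("i2", "add", ("i1", "1"))],
)
# ivs: {'i1': {'init': '0', 'step': '1'}}
```

With init and step recovered, emitting `for (i = 0; cond; i += 1)` is a local rewrite; without SSA you'd have to trace reaching definitions by hand.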


This is helpful. But I read it and think, for instance, "distinct basic blocks aren't SSA"; compilers worked in terms of CFGs before SSA existed. :)

Again this is more about my lack of confidence about fully grokking the implications of SSA; I'm not nerd-sniping.


Of course, you can have basic blocks without SSA. It's just another feature missing from the article that was worth mentioning.

Another thing you get for free from SSA: nicely reconstructed ternary expressions (even if the original code used ifs).
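The ternary case falls out of SSA's phi nodes: an if/else diamond whose join block contains a two-operand phi collapses directly to `cond ? a : b`. A minimal sketch, with a hypothetical representation of the branch and the join-block phi:

```python
def reconstruct_ternary(branch, join_phi):
    """branch: (condition, then_block, else_block) of a diamond CFG.
    join_phi: phi destination -> (value from then-branch, value from else-branch).
    Collapses the diamond to a single ternary assignment."""
    cond, _then, _else = branch
    (dest, (tval, fval)), = join_phi.items()
    return f"{dest} = {cond} ? {tval} : {fval}"

stmt = reconstruct_ternary(
    branch=("a > b", "B1", "B2"),
    join_phi={"x2": ("a", "b")},   # x2 = phi(a from B1, b from B2)
)
# stmt: 'x2 = a > b ? a : b'
```

The phi makes the pattern purely structural: no dataflow search is needed to discover that both branches exist only to define one value.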



