
For what it's worth, here's one counter-example:

I've recently been working with a team developing a content management system whose display side (responsible for routing requests to models and views, fetching model content, rendering, etc.) is about 500 lines of non-comment CoffeeScript.

Nearly 100% of the server-side code is re-used on the client, and perhaps 80% of the (non-library) client-side code is shared with the server.
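
To give a feel for the mechanics, here's a minimal sketch of the pattern (invented names, not our actual code): a CoffeeScript module that exports via CommonJS when loaded under Node and hangs itself off window when loaded in a browser:

    # renderer.coffee -- hypothetical shared module
    # A toy renderer: fills {{name}} placeholders from a model object.
    renderTemplate = (template, model) ->
      template.replace /\{\{(\w+)\}\}/g, (_, key) -> model[key] ? ''

    if module?.exports?
      module.exports = {renderTemplate}    # server: require './renderer'
    else
      window.Renderer = {renderTemplate}   # client: plain <script> tag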

In this case the exceptions are (see the adapter sketch after the list):

- Things like HTTP-level request handling, and reading content from local files rather than over HTTP, are used on the server side only.

- Things like DOM manipulation and interacting with browser events are used on the client side only.
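
The way we keep those exceptions from leaking everywhere is to hide each one behind a tiny adapter with the same signature on both sides. A hypothetical sketch (again, invented names; assumes Node's fs on the server and XMLHttpRequest in the browser):

    # content.coffee -- hypothetical adapter, same call either side
    if module?.exports?
      fs = require 'fs'
      fetchContent = (path, cb) ->
        fs.readFile path, 'utf8', cb       # server: read from local files
    else
      fetchContent = (path, cb) ->         # client: same call, over HTTP
        xhr = new XMLHttpRequest
        xhr.open 'GET', path
        xhr.onload  = -> cb null, xhr.responseText
        xhr.onerror = -> cb new Error "GET #{path} failed"
        xhr.send()
    # (export plumbing as in the renderer sketch, omitted)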

I wonder if a significant factor is how much functionality you're actually replicating on both the client and the server. In our case, browser permitting, the entire display engine runs equally well on the client or on the server. If we had a lot of jQuery-style work happening in the client-side JavaScript the ratios might look a little different, but "also-run-the-app-on-the-client" is a good example of a use case that leads to a lot of reuse of server-side code.
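
The payoff is that the calling code doesn't care where it's running; continuing the (hypothetical) sketches above:

    # On the server (Node):
    {renderTemplate} = require './renderer'
    html = renderTemplate '<h1>{{title}}</h1>', title: 'Home'

    # In the browser:
    html = Renderer.renderTemplate '<h1>{{title}}</h1>', title: 'Home'
    document.body.innerHTML = html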


