
I'm not really seeing the advantage over modern JS async; perhaps you can explain your assertion?


Js:

  async function bar(){...}
  function baz() {...}

  async function foo() {
    var x = await bar();
    var y = baz();
    return [x, y]; // "return x, y" would use the comma operator and return only y
  }
Lua:

  function bar() return something_that_may_yield_deep_inside(...) end
  function baz() ... end

  function foo()
    local x = bar()
    local y = baz()
    return x, y
  end
In other words, in Lua it is irrelevant whether something yields or not, so you don't care whether it is async.


But I think I do care.

When I see

    var x = await bar();
I know that other code, outside bar(), may have run during the execution of that statement.
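That suspension point is exactly where shared state can change underneath you. A minimal sketch of the hazard (all names hypothetical; `tick()` stands in for any real async call):

```javascript
// Two tasks both read shared state, suspend at an await, then write back.
let counter = 0;
const tick = () => new Promise((resolve) => setTimeout(resolve, 0));

async function incrementTwice() {
  const seen = counter;   // read
  await tick();           // suspension point: other code runs here
  counter = seen + 1;     // write based on a now-stale read
}

async function run() {
  counter = 0;
  await Promise.all([incrementTwice(), incrementTwice()]);
  return counter;         // 1, not 2: both tasks read before either wrote
}

run().then((n) => console.log(n)); // prints 1
```

The explicit `await` at least marks where such interleavings can happen; with implicit yields the same bug exists but nothing in the caller's source flags it.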

Also, the JS approach makes composing asynchronous operations simple:

   var x = bar();
   var y = bar();
   return [await x, await y]; // the comma operator would discard the first value
Both bar invocations can run in parallel. If, say, each invocation of bar fires off an Ajax request that takes a few seconds to come back, that can be a significant saving.
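A runnable sketch of that saving, with a hypothetical `bar()` that resolves after ~50 ms in place of the Ajax request:

```javascript
// Hypothetical async operation: resolves to 42 after ~50 ms.
const bar = () => new Promise((resolve) => setTimeout(() => resolve(42), 50));

async function sequential() {
  const start = Date.now();
  const x = await bar();                 // waits ~50 ms
  const y = await bar();                 // waits another ~50 ms
  return { sum: x + y, elapsed: Date.now() - start };
}

async function concurrent() {
  const start = Date.now();
  const px = bar();                      // both requests in flight...
  const py = bar();
  const [x, y] = [await px, await py];   // ...before we wait on either
  return { sum: x + y, elapsed: Date.now() - start };
}

async function main() {
  const seq = await sequential();
  const con = await concurrent();
  console.log(seq.sum, con.sum, con.elapsed < seq.elapsed); // 84 84 true
}
main();
```

The concurrent version finishes in roughly one delay instead of two, because the second request is started before the first is awaited.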

It's unclear (to me) how that would be done in Lua without complicating the API.


It is false security. If you call a function, either you know what it does (so in Lua you would know whether it can yield), or you don't, in which case it could be calling arbitrary code anyway, so you have to code defensively and think about reentrancy. The only reason for await is that it normally needs to save less state (a single frame) than a full coroutine yield (a whole stack).

Also, Lua-like stackful coroutines don't prevent firing multiple asynchronous operations at the same time (as in your example); they only make the waiting much more pleasant.


OpenResty, for one, allows that via grouping queries, rather than results:

    local requests = { "/mysql", "/postgres", "/redis", "/memcached" }
    local responses = { ngx.location.capture_multi(requests) }

    for i, response in ipairs(responses) do
        print(response.foobar)
    end
But instead ngx could lazy-evaluate responses with help of metatables (see my other comment).


In theory, someone could write a Babel plugin that wraps all files with `(async () => { /* ... */ })()` and prepends all function/method calls with `await ` (I think). I personally like the explicitness, but maybe that's just because I'm used to it. I also wonder how that would work with `Promise.all()`.


We're now sorting things out for a third-party JS library written manually in that style. It is a mess.

>Promise.all()

Objects returned from yields_deep_inside() may be implemented as lazy-evaluated, i.e. only `print(result.items)` or an explicit `await(result)` will yield upon use, while the request is sent immediately. Thus the order of execution will depend purely on the natural use case, not on programmed await sequences/groups. Since JS is not parallel, you'll touch A or B first, not both.

How to sort it all out is the responsibility of the event framework, not of the green-thread abstraction that a coroutine is.

One more thing: I don't know how JS optimizes the endless closures that are spawned as promise callbacks, but a Lua coroutine's yield/resume is as cheap as returning from / passing control to the VM. No closures are created to retain state across async calls, because the VM stack is the state itself. I suspect wrapping everything in async/await will simply thrash GC and VM performance. In a sense, asynchrony is only emulated in JavaScript, at a cost, though I may be wrong and it may all be optimized away. We have to wait for JS VM implementation experts to [dis]prove that.


So what happens to the call stack above foo? It looks like `await` is implied and the program stops until `bar` is done.


The entire green thread is paused at yield() and resumed when another [controlling] thread calls resume() on it. The event loop has to track which thread yielded on which event source and resume the corresponding thread when the event arrives. The program as a whole doesn't stop; only sequential blocking paths are put on a waiting list. If each incoming query starts a new thread, then all queries run concurrently around the event/IO loop, implicitly switching on a deep yield() that the programmer never spells out. That's how e.g. OpenResty (nginx+Lua) handlers work.
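The yield/resume bookkeeping above can be sketched with JS generators. All names here are hypothetical, and note one honest caveat: JS generators are not stackful, so the "deep" yield must be threaded up through every frame with `yield*`, which is precisely the annotation burden Lua coroutines avoid.

```javascript
// Coroutine-style handlers suspend on an event id; a tiny "event loop"
// resumes the matching handler when that event's payload arrives.
function* readSocket(id) {
  return yield { waitFor: id };          // suspend until resumed with data
}
function* bar(id) { return yield* readSocket(id); } // yields "deep inside"
function* handler(id) {
  const x = yield* bar(id);
  return `handled ${x}`;
}

function runAll(threads, events) {
  const waiting = new Map();             // event id -> suspended generator
  const results = [];
  for (const t of threads) {
    const { value } = t.next();          // run until the first yield
    waiting.set(value.waitFor, t);
  }
  for (const [id, payload] of events) {  // event loop: resume on arrival
    const { value, done } = waiting.get(id).next(payload);
    if (done) results.push(value);
  }
  return results;
}

const out = runAll(
  [handler("a"), handler("b")],
  [["b", "B-data"], ["a", "A-data"]]    // events arrive out of order
);
console.log(JSON.stringify(out)); // ["handled B-data","handled A-data"]
```

Handlers finish in event-arrival order, not in the order they were started, and no handler's source mentions waiting at all.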


Surely it's nice not having to worry about it, but how is that better than explicit await?

If a sub-sub-sub-function suddenly decides to yield, you might not even know that your whole path is waiting on it. Right? Perhaps you need baz() to run ASAP, but that isn't clear without looking deep into bar().


One big advantage is that higher-order code can work with both sync and async code. The simplest example would be a for-each iterator, but this also applies to anything else that calls a function or method that is passed in.
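A concrete sketch of that limitation in JS (hypothetical helper names): a sync higher-order function silently mishandles async callbacks, so you end up maintaining two colour-matched variants.

```javascript
// Sync variant: fine for plain callbacks, wrong for async ones.
function mapSync(items, fn) {
  const out = [];
  for (const it of items) out.push(fn(it));
  return out;
}

// Async variant: near-identical code, duplicated only to insert `await`.
async function mapAsync(items, fn) {
  const out = [];
  for (const it of items) out.push(await fn(it));
  return out;
}

const double = (x) => x * 2;
const doubleLater = async (x) => x * 2;

console.log(JSON.stringify(mapSync([1, 2, 3], double)));        // [2,4,6]
mapAsync([1, 2, 3], doubleLater)
  .then((r) => console.log(JSON.stringify(r)));                 // [2,4,6]
// mapSync([1, 2, 3], doubleLater) would collect pending Promises, not numbers.
```

With stackful coroutines, one `map` serves both cases, because a callback that suspends just suspends the whole coroutine.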


Yeah, it's a coroutine; on yield the system suspends the entire stack: http://www.lua.org/manual/5.1/manual.html#2.11


Well, I do want to care whether it's async. I don't think it's a good design. The async keyword makes asynchronous code easier, but it's still asynchronous code with its pitfalls; I think that hiding it under ordinary calls is bad design.



