Thanks!

As for the optimizations: we transpile the code to different target runtime languages where we can automatically instrument different optimizations. Currently our microservice runtime is Scala code, and we can add a cache layer around any function application or a parallelization queue around any map-like operation on iterable types (e.g., List).

The application deployment UI lets you pick the points of optimization (a top-level function for a cache, or a map-like application for parallelization). Then you can pick the tier of optimization implementation you want (e.g., local in-memory map vs. remote Redis instance; local thread-based parallelization vs. remote actor-based cluster parallelization).
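To make the two "local" tiers concrete, here is a minimal sketch in Scala (the runtime language named above) of what the instrumented code might look like. The names `cached` and `parMap` are hypothetical illustrations, not the actual generated code: `cached` wraps a function in an in-memory memoizing map (the local cache tier), and `parMap` runs a map-like operation over a List on a thread pool (the local thread-based parallelization tier).

```scala
import scala.collection.concurrent.TrieMap
import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global

object OptimizationSketch {
  // Hypothetical local in-memory cache tier: memoize any pure
  // function with a thread-safe TrieMap keyed on its argument.
  def cached[A, B](f: A => B): A => B = {
    val memo = TrieMap.empty[A, B]
    a => memo.getOrElseUpdate(a, f(a))
  }

  // Hypothetical local thread-based parallelization tier: apply f
  // to each element concurrently via Futures; result order is kept.
  def parMap[A, B](xs: List[A])(f: A => B): List[B] =
    Await.result(Future.traverse(xs)(a => Future(f(a))), Duration.Inf)

  def main(args: Array[String]): Unit = {
    val slowSquare = cached((n: Int) => { Thread.sleep(10); n * n })
    val result = parMap(List(1, 2, 3))(slowSquare)
    assert(result == List(1, 4, 9))
    println("ok")
  }
}
```

The remote tiers (Redis-backed cache, actor-based cluster parallelization) would swap in a different backing store or executor behind the same wrapping points, which is presumably why the UI can offer them as interchangeable tiers.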
