Apples and oranges. This is a young implementation, and nothing that generates PDFs takes its performance into account. Until it has had significant optimization you can't draw very good conclusions about HTML+JS vs. native.
This is wrong because PDF.js is Firefox's default way of presenting PDFs. If it's a young, unoptimized code base, then maybe Firefox shouldn't make it the default. It is an apples-to-apples comparison here, because it is the default, and how young it may be is irrelevant to the user experience.
It's a very dangerous attitude: "I don't read PDFs and I don't care, but the customers who do read them should tolerate our young, poor implementation that needs more than two minutes for just 15 pages."
They won't. As soon as you prevent them from doing their job (and that's the case if they used their PDFs normally before you changed the defaults), they will have to search for a solution. The solution is to switch either the handler (still a little better for the browser makers) or the browser.
By making opening and reading a 15-page PDF, which used to be instantaneous, take two minutes, you prevent them from doing their job (the slowdown from subjectively 0 seconds to minutes is also a subjectively infinitely worse experience!) and they have to respond. They can't open just the first page. They actually care. They need all the pages.
Why do you keep acting like a document cannot be interacted with until every single page is fully rendered? Especially when the viewer goes out of order to get the onscreen pages ready first.
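For what it's worth, here's a rough TypeScript sketch of that viewport-first idea using the public pdfjs-dist API. The worker path, the scale, and appending one canvas per page are illustrative assumptions, not how Firefox's built-in viewer is actually wired up:

```typescript
import * as pdfjsLib from "pdfjs-dist";

// Illustrative assumption: real builds point this at the bundled worker file.
pdfjsLib.GlobalWorkerOptions.workerSrc = "pdf.worker.min.js";

async function renderVisiblePageFirst(url: string, visiblePage: number): Promise<void> {
  const pdf = await pdfjsLib.getDocument(url).promise;

  // Render one page into its own canvas; it is usable as soon as its own render finishes.
  const renderPage = async (pageNumber: number): Promise<void> => {
    const page = await pdf.getPage(pageNumber);
    const viewport = page.getViewport({ scale: 1.0 });
    const canvas = document.createElement("canvas");
    canvas.width = viewport.width;
    canvas.height = viewport.height;
    document.body.appendChild(canvas);
    await page.render({ canvasContext: canvas.getContext("2d")!, viewport }).promise;
  };

  // The page currently on screen goes first...
  await renderPage(visiblePage);

  // ...then the rest fill in out of document order, without blocking interaction
  // with the pages that are already drawn.
  for (let n = 1; n <= pdf.numPages; n++) {
    if (n !== visiblePage) await renderPage(n);
  }
}
```

The point of the sketch is only that rendering is per-page and asynchronous, so the document can be read long before every page has been drawn.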
When you need some information from a 15-page document, you don't think "I know I need the 9th page." You look at one page after another. That takes 0 seconds with a native renderer (you can't perceive the wait), and it takes minutes with pdf.js -- infinitely longer, enough to make you stop using it.