gzip sizes #23
So there are some challenges here. First off, I'm trying to keep this as consistent as I can for both the plugin and the web version. In the web version, all we have to go on is the stats.json.

It does not include the actual source of the output bundles, which means that we don't have anything to actually run gzip on. And even if we did, the best we could do is gzip the entire bundle and then use the post-loader size percentages to estimate a gzip size for each module. This is what I'm doing for the minified sizes already.

It might be possible to work around some of these issues in the plugin version alone, since it can get access to the actual bundle source. Some other tools like webpack-bundle-analyzer do clever things like parsing the bundle source and figuring out where each module is located in it, which yields a very accurate minified size. However, running gzip on those modules one at a time yields an overestimate, since the bundle as a whole gzips much more efficiently.

I'm thinking, though, that multiplying each module's size percentage by the total gzipped size would probably be close enough. So that might be an option for the plugin version...
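For illustration, a minimal sketch of that percentage-based estimate (the function and field names here are mine, not from the actual codebase):

```js
// Gzip the whole bundle once, then attribute a share of the gzipped
// total to each module in proportion to its post-loader size.
const zlib = require('zlib');

function estimateModuleGzipSizes(bundleSource, modules) {
  const totalGzipped = zlib.gzipSync(bundleSource).length;
  const totalSize = modules.reduce((sum, m) => sum + m.size, 0);
  return modules.map((m) => ({
    name: m.name,
    gzipEstimate: Math.round(totalGzipped * (m.size / totalSize)),
  }));
}
```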
@chrisbateman Wow, great to see such a response. I wasn't aware that there is a client-side version of this (I forgot about the hosted version with the upload ability), which obviously doesn't see the source anymore. But the work you did, and so quickly, is spot on. An estimate of gzip size per file is pretty good. Of course the entire bundle might pack more efficiently, but the reason I asked for source-level compression is to detect the impact of included code that is harder to compress (e.g. embedded base64-encoded JPEGs or similar, which usually don't compress as well as the rest). Such cases are rare; I just happened to have something like this.
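As a hedged illustration of that point: comparing a module's standalone gzip ratio against the rest of the bundle would surface such outliers. Base64 data uses only 64 symbols, so gzip can rarely shrink it much below roughly three quarters of its size, while ordinary JS often compresses to a third or less.

```js
// Illustrative helper (not from the plugin): gzip ratio of a source
// string. Ratios near 1.0 indicate hard-to-compress content such as
// embedded base64-encoded images.
const zlib = require('zlib');

function gzipRatio(source) {
  return zlib.gzipSync(source).length / Buffer.byteLength(source);
}
```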
Sorry, I didn't want to close it, in case you might want to track this.
Hiya! Do you think we could combine our efforts on this idea? I'm looking at solving webpack-contrib/webpack-bundle-analyzer#32 right now and would want to collaborate with other npm packages trying to solve the same issue.
@valscion The main complication here is that this is both a plugin and a website, and the website only gets the stats.json data. What if we had a module that was solely responsible for stats extraction, and it could accept either stats.json alone, or that plus the other info you can get from a plugin?
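A rough sketch of what that shared extraction module's interface could look like (every name here is hypothetical, not an agreed API):

```js
// Hypothetical shared extractor: always works from stats.json, and uses
// the extra data only a plugin can supply (real asset sources) when given.
function extractBundleData(stats, pluginData) {
  const modules = stats.modules.map((m) => ({
    name: m.name,
    size: m.size, // post-loader size, available in stats.json
  }));
  const hasSources = Boolean(pluginData && pluginData.assetSources);
  // With real sources we could compute exact gzip sizes; otherwise we
  // fall back to the percentage-based estimates described earlier.
  return { modules, hasSources };
}
```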
It seems that webpack-bundle-analyzer needs access to the filesystem to get the other sizes besides the parsed one. The package ships with a CLI that can be fed a stats.json file, but unless it's also given the location of the bundle directory, it can't report gzip and minified sizes. I'm not sure how everything works under the hood, but I'll see whether we could make do with just the stats.json if it contained the sources, too.
This is a nice plugin, but shouldn't it be possible to take the actual file contents (not just the sizes) and run an on-the-fly gzip (e.g. using zlib) to calculate the correct gzipped size, or at least a better approximation? Does the code use the file contents or just the sizes?
```js
const zlib = require('zlib');

zlib.gzip(code, (err, zipped) => {
  if (err) throw err;
  // zipped.length is the gzipped size in bytes
});
```
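And for the plugin side, where the real asset sources are in hand, a sketch of computing exact gzip sizes per emitted asset (assuming the webpack 4+ hooks API; the plugin name is made up):

```js
const zlib = require('zlib');

class GzipSizePlugin {
  apply(compiler) {
    compiler.hooks.emit.tap('GzipSizePlugin', (compilation) => {
      for (const [name, asset] of Object.entries(compilation.assets)) {
        // asset.source() returns the full file content, so this is the
        // true gzipped size rather than an estimate.
        const gzipped = zlib.gzipSync(asset.source());
        console.log(`${name}: ${gzipped.length} bytes gzipped`);
      }
    });
  }
}

module.exports = GzipSizePlugin;
```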