digitalsanctum|2 years ago
1. Ability to filter response properties.
2. Ability to work with non-JSON (web scraping) by defining a mapping of CSS selectors to response properties.
3. Cross-reference host names of captured requests with publicly documented APIs.
4. If auth headers are found, prompt user for credentials that can then be stored locally.
5. A "Repeater" similar to the one found in Burp Suite.
6. Generate clients on the fly based on the generated OpenAPI spec.
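Suggestion 1 above (filtering response properties) can be sketched as a small transform that runs before spec generation. Everything here is illustrative, not part of the actual tool: the function name `filter_properties` and the dotted-path whitelist syntax are assumptions.

```python
def filter_properties(response: dict, keep: list[str]) -> dict:
    """Return a copy of `response` containing only the dotted paths in `keep`.

    Hypothetical sketch: copies are shallow, and absent paths are skipped
    silently rather than reported.
    """
    result: dict = {}
    for path in keep:
        parts = path.split(".")
        src, dst = response, result
        for i, part in enumerate(parts):
            if not isinstance(src, dict) or part not in src:
                break  # path absent in this response; skip it
            if i == len(parts) - 1:
                dst[part] = src[part]  # leaf: copy the value (by reference)
            else:
                dst = dst.setdefault(part, {})  # descend, creating nesting
                src = src[part]
    return result

# Usage: keep only the fields you want in the generated schema.
captured = {"id": 1, "user": {"name": "a", "email": "x@y.z"}, "trace": "dbg"}
slim = filter_properties(captured, ["id", "user.name"])
# slim == {"id": 1, "user": {"name": "a"}}
```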
worldsayshi|2 years ago
- Integration with some kind of web crawler to allow automatically walking a web site and extracting a database of specifications
Edit: Hmm, it seems that genson-js[1] was used to merge schemas.
1 - https://www.npmjs.com/package/genson-js
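For readers unfamiliar with genson-js: it infers a JSON Schema from sample values and can merge the schemas of multiple samples. The toy Python below illustrates the general idea only (type inference plus an `anyOf` union on conflicts); it is not genson-js's actual algorithm or API.

```python
def merge_schemas(a: dict, b: dict) -> dict:
    """Merge two inferred schemas (toy version of what genson-js does)."""
    if a == b:
        return a
    if a.get("type") == "object" and b.get("type") == "object":
        # Union the property sets, merging schemas for shared keys.
        props = dict(a.get("properties", {}))
        for key, schema in b.get("properties", {}).items():
            props[key] = merge_schemas(props[key], schema) if key in props else schema
        return {"type": "object", "properties": props}
    return {"anyOf": [a, b]}  # conflicting shapes: fall back to a union

def infer_schema(value) -> dict:
    """Infer a minimal JSON Schema for a single sample value."""
    if isinstance(value, bool):  # check bool before int (bool is an int subtype)
        return {"type": "boolean"}
    if isinstance(value, int):
        return {"type": "integer"}
    if isinstance(value, float):
        return {"type": "number"}
    if isinstance(value, str):
        return {"type": "string"}
    if value is None:
        return {"type": "null"}
    if isinstance(value, list):
        schema = {"type": "array"}
        item_schemas = [infer_schema(v) for v in value]
        if item_schemas:
            merged = item_schemas[0]
            for s in item_schemas[1:]:
                merged = merge_schemas(merged, s)
            schema["items"] = merged
        return schema
    return {"type": "object",
            "properties": {k: infer_schema(v) for k, v in value.items()}}
```

Merging is what lets two captures of the same endpoint, each with different optional fields, collapse into one schema instead of two.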
mrmagoo2|2 years ago
The idea for a crawler is a good one. The core logic that handles spec generation is decoupled from everything else, so it can be extracted into its own library.
But approaches for this already exist, such as har-to-openapi.
https://github.com/jonluca/har-to-openapi
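To make the comparison concrete, here is a hedged sketch of the core transform a HAR-to-OpenAPI converter performs: mapping one HAR 1.2 entry to a minimal OpenAPI path item. The HAR field names (`request.method`, `request.url`, `queryString`, `response.status`) follow the HAR 1.2 format; the function name and the deliberately minimal output shape are assumptions, not har-to-openapi's actual code, which emits much richer specs.

```python
from urllib.parse import urlsplit

def har_entry_to_path(entry: dict) -> tuple[str, dict]:
    """Map one HAR entry to a (path, path-item) pair in OpenAPI 3 shape."""
    req, res = entry["request"], entry["response"]
    path = urlsplit(req["url"]).path or "/"

    operation = {
        "responses": {
            # OpenAPI keys response codes as strings.
            str(res["status"]): {"description": res.get("statusText") or "response"}
        }
    }
    # Lift captured query parameters into OpenAPI parameter objects.
    params = [
        {"name": q["name"], "in": "query", "schema": {"type": "string"}}
        for q in req.get("queryString", [])
    ]
    if params:
        operation["parameters"] = params

    return path, {req["method"].lower(): operation}
```

A crawler (as suggested above) would feed each captured entry through a transform like this and merge the resulting path items into one spec.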
digitalsanctum|2 years ago
8. Optionally publish generated OpenAPI specs to a central site, or open a PR to a GH repo, "awesome-openapi-devtools"?
mrmagoo2|2 years ago