Title: Launch HN: Codeparrot (YC W23) – Automated API testing using production traffic
Site: news.ycombinator.com

Hi HN, we’re Royal and Vedant, co-founders of CodeParrot (https://www.codeparrot.ai/). CodeParrot automates API testing so developers can speed up release cycles and increase test coverage. It captures production traffic and database state to generate test cases that update with every release.
Here’s a short video that shows how it works: https://www.loom.com/share/dd6c12e23ceb43f587814a2fbc165c1f .
As managers of engineering teams (I was CTO of an ed-tech startup, Vedant was the founding engineer of a unicorn company), both of us faced challenges enforcing high test coverage. We ended up relying heavily on manual testing, but it was hard to scale and led to slower release velocity and more production bugs. This motivated us to build CodeParrot.
How it works: we auto-instrument backend services to capture production traffic. Requests and responses hitting your backend service are stored, along with the downstream calls it makes (e.g., DB calls). As part of your CI pipeline, we replay the captured requests whenever your service is updated. The responses are compared against the production responses, and regressions are highlighted to developers. To ensure that the same codebase gives the same response in the CI environment as in production, we mock all downstream calls with the values recorded from production.
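To make the replay-and-compare loop concrete, here is a minimal sketch in Python. The names (`Capture`, `replay_and_compare`, `handler`) and data shapes are illustrative assumptions, not CodeParrot's actual API; the point is that downstream calls are mocked with recorded production values, so any diff in the response is attributable to a code change rather than data drift.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Capture:
    """One recorded production interaction: the inbound request, the
    response the service returned, and every downstream result (e.g.
    DB rows) observed during that request. Illustrative schema."""
    request: dict[str, Any]
    prod_response: dict[str, Any]
    downstream: dict[str, Any]  # downstream call key -> recorded result

def replay_and_compare(
    captures: list[Capture],
    handler: Callable[[dict[str, Any], Callable[[str], Any]], dict[str, Any]],
) -> list[dict[str, Any]]:
    """Replay each captured request against the new build, mocking
    downstream calls with the recorded values, and collect any
    responses that diverge from production."""
    regressions = []
    for cap in captures:
        # Mock: downstream calls return exactly what production saw.
        mock = lambda key, _d=cap.downstream: _d[key]
        ci_response = handler(cap.request, mock)
        if ci_response != cap.prod_response:
            regressions.append({
                "request": cap.request,
                "expected": cap.prod_response,
                "got": ci_response,
            })
    return regressions

# Toy service under test: looks a user up "in the DB" via the injected call.
def handler(request, call_downstream):
    user = call_downstream("db:get_user")
    return {"id": user["id"], "name": user["name"].title()}

captures = [
    Capture(
        request={"path": "/user/1"},
        prod_response={"id": 1, "name": "Ada"},
        downstream={"db:get_user": {"id": 1, "name": "ada"}},
    )
]
print(replay_and_compare(captures, handler))  # [] -> no regressions
```

If a code change altered the handler's output (say, upper-casing the name instead), the replay would flag that capture as a regression, with the expected and actual responses side by side.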
Most tools that record and replay production traffic for testing capture it at the network layer (as a sidecar or through the load balancer). CodeParrot instead relies on an instrumentation agent (built on top of OpenTelemetry) to capture traffic, which lets us capture downstream requests/responses, such as database responses, that are otherwise encrypted at the network layer. This is what allows us to mock downstream calls and compare responses between the CI environment and production. Additionally, it lets us sample requests based on code flow and downstream responses, which provides better test coverage than relying on API headers and parameters alone.
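As a rough sketch of what coverage-driven sampling could look like: instead of keying captured traffic on the API path and headers alone, key it on the sequence of instrumented operations plus the shape of each downstream response, so structurally distinct executions are all kept while near-duplicate traffic is dropped. The trace layout and function names below are illustrative assumptions, not CodeParrot's internals.

```python
import hashlib
import json

def flow_signature(endpoint: str, spans: list[dict]) -> str:
    """Hash the ordered operations plus the *shape* of each downstream
    result (status and top-level keys), ignoring concrete values so
    equivalent executions collapse to one signature."""
    shape = [
        (s["op"], s.get("status"), sorted(s.get("result", {}).keys()))
        for s in spans
    ]
    payload = json.dumps([endpoint, shape], sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()[:12]

def sample(traces: list[dict]) -> list[dict]:
    """Keep one representative trace per distinct code-flow signature."""
    seen, kept = set(), []
    for t in traces:
        sig = flow_signature(t["endpoint"], t["spans"])
        if sig not in seen:
            seen.add(sig)
            kept.append(t)
    return kept

traces = [
    # Two cache hits with different payloads -> same flow, sampled once.
    {"endpoint": "/user", "spans": [
        {"op": "cache.get", "status": "hit", "result": {"id": 1}}]},
    {"endpoint": "/user", "spans": [
        {"op": "cache.get", "status": "hit", "result": {"id": 2}}]},
    # Cache miss followed by a DB read -> a new code path, kept.
    {"endpoint": "/user", "spans": [
        {"op": "cache.get", "status": "miss"},
        {"op": "db.query", "status": "ok", "result": {"id": 3}}]},
]
print(len(sample(traces)))  # 2
```

Header-based sampling would treat all three of these as the same `/user` request; flow-based sampling keeps the cache-miss path as a separate test case.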
Our self-serve product will be out in a few weeks. Meanwhile, we can help you integrate CodeParrot: please reach out at email@example.com or pick a slot here: https://tidycal.com/royal1/schedule-demo. We’ll be selling CodeParrot via a subscription model, but the details are TBD. In addition, we will be open-sourcing the project soon.
If you’ve already tried or are thinking of using tools in this space, we’d love to hear your experience and what you care about most. We look forward to everyone’s comments!