I'm using Playwright-Python to generate PDFs inside a Celery task. This works great, except that the chrome process sticks around afterwards, even though browser.close() is called and sync_playwright() is used as a context manager.
Interestingly, the process gets properly cleaned up when running this locally on Linux and macOS, but in production, inside a Docker container running the Celery task, it does not. The node driver process, on the other hand, is cleaned up properly everywhere.
Curious if anyone has any ideas about what might be going on, or whether there are further debug logs that could be enabled. Thanks.
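On the debug-log side, the only knobs I'm aware of are Playwright's DEBUG namespaces (pw:api, pw:browser, etc.); a minimal sketch of enabling them, assuming the variable just needs to be in the environment the Node driver inherits:

import os

# Sketch: must be set before sync_playwright() spawns the Node driver,
# since the driver reads DEBUG from the environment it inherits.
os.environ["DEBUG"] = "pw:api,pw:browser*"  # API calls + browser process logs

And the task code itself: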
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = None
    try:
        logger.info("Launching browser")
        browser = p.chromium.launch(args=["--font-render-hinting=none"])
        page = browser.new_page()
        page.on("console", lambda msg: logger.info(msg.text))

        browser_error = None

        def on_pageerror(err):
            nonlocal browser_error
            browser_error = err

        page.on("pageerror", on_pageerror)

        logger.info("Loading template in browser")
        page.goto(f"file://{template_path}", wait_until="networkidle")

        logger.info("Exporting as PDF")
        pdf_bytes = page.pdf(print_background=True)

        if browser_error:
            raise RuntimeError(
                f"Browser error while generating PDF: {browser_error}"
            )
    finally:
        if browser:
            browser.close()
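One theory I'm considering (an assumption on my part, not confirmed): Chromium forks several helper processes, and inside a container whose PID 1 doesn't reap orphaned children, those can be left behind as <defunct> zombies even after browser.close() returns. A quick check I can run from the task after close, using a hypothetical helper of my own (the name is mine, requires psutil, and reuses the task's logger):

import psutil

def log_leftover_chromium():
    # List any chromium-ish processes still visible after browser.close().
    # A status of "zombie" would point at a PID 1 that isn't reaping children.
    for proc in psutil.process_iter(["name", "status"]):
        name = proc.info["name"] or ""
        if "chrom" in name.lower():
            logger.info("leftover pid=%s name=%s status=%s",
                        proc.pid, name, proc.info["status"])

If they do show up as zombies, the usual fix is at the container level rather than in the Playwright code: give the container a real init as PID 1, e.g. docker run --init or tini as the entrypoint, so orphaned children get reaped.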