Hello, I've got a couple of questions related to the same topic and don't want to open several threads, so here we go ... 🧵 Context 🧵 I'm new to web automation/scraping and I'm building a little application to automate a complex user flow. I noticed in the network tab that the site sends lots of tracking events, and since the user needs to be logged in for it to work, I suppose unusual activity can be tracked easily. The process should run every hour, or maybe every 30 minutes.
1️⃣ - Can websites detect if I block image loading? I'd love to block images to gain some performance.
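On the performance side, a minimal sketch (Python; the `should_block` helper and the set of blocked types are my own illustrative choices, not anything from this thread) of filtering requests by resource type. Worth keeping in mind that a site could in principle notice the absence of expected image requests in its tracking data:

```python
# Sketch: decide which resource types to skip to save bandwidth.
# BLOCKED_TYPES and should_block() are illustrative assumptions.

BLOCKED_TYPES = {"image", "media", "font"}

def should_block(resource_type: str) -> bool:
    """Return True if a request of this resource type should be aborted."""
    return resource_type in BLOCKED_TYPES

# In Playwright's Python API this predicate would plug into route
# interception roughly like:
#   page.route("**/*", lambda route: route.abort()
#              if should_block(route.request.resource_type)
#              else route.continue_())
```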
2️⃣ - What's an acceptable human-like time interval between actions? And should those intervals differ between button presses and scrolling?
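One common approach is randomized, per-action jitter rather than a fixed sleep. A minimal sketch in Python; the ranges below are guesses for illustration, not measured human timing data:

```python
import random

# Illustrative per-action delay ranges in seconds (assumptions, not
# measured data): clicks get a short pause, scrolls a longer one.
DELAY_RANGES = {
    "click": (0.3, 1.2),
    "scroll": (0.8, 2.5),
}

def human_delay(action: str) -> float:
    """Return a jittered delay for the given action type.

    Uses a Gaussian centered in the range, clamped to the range so we
    never return an implausibly fast (or negative) delay.
    """
    low, high = DELAY_RANGES.get(action, (0.5, 1.5))
    mean = (low + high) / 2
    sd = (high - low) / 4
    return min(max(random.gauss(mean, sd), low), high)
```

The idea is that identical, machine-precise intervals are easier to flag than values drawn from a distribution.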
3️⃣ - Is it viable to store which pages I've already visited to reduce the total number of requests? If so, what's the recommended way to store and retrieve them? I was thinking of keeping them for about 24 hours in a JSON file and, after that period, moving them to another JSON file for long-term storage that another module of the application can use.
Sorry if I'm asking too much. I tried to look all of this up but only found beginner-level results that weren't helpful at all. Have the silly cat in the attachment as thanks for reading all of that 😇 💕
Rayrun is a community for QA engineers. I am constantly looking for new ways to add value to people learning Playwright and other browser automation frameworks. If you have feedback, email [email protected].