mozillas | 7 years ago
That's what I did, but I might have had different requirements. If you don't have a lot to crawl and you don't have to do it very often (once a week or less), you can probably space out the requests enough that the server doesn't feel it. It also helps a lot to cache the pages themselves in this case, so you don't refetch what you already have. I think it depends a lot on the requirements of the project. Using two machines is safer, I think, although it might complicate things a bit.
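The spacing-plus-caching idea above can be sketched roughly like this. This is a minimal sketch, not the commenter's actual code: the `PoliteCrawler` name, the `delay` default, and the injected `fetch` callable (which you'd wrap around whatever HTTP client you use) are all my assumptions.

```python
import time


class PoliteCrawler:
    """Space out live requests and cache responses, so repeat
    visits never hit the server a second time. (Hypothetical
    sketch -- names and defaults are illustrative.)"""

    def __init__(self, fetch, delay=5.0):
        self.fetch = fetch      # callable: url -> response body
        self.delay = delay      # minimum seconds between live requests
        self.cache = {}         # url -> cached body
        self._last = 0.0        # monotonic time of last live request

    def get(self, url):
        if url in self.cache:
            return self.cache[url]          # cached: no network, no delay
        wait = self.delay - (time.monotonic() - self._last)
        if wait > 0:
            time.sleep(wait)                # throttle so the server barely notices
        body = self.fetch(url)
        self._last = time.monotonic()
        self.cache[url] = body
        return body
```

In practice you'd also persist the cache to disk and respect the site's robots.txt, but the core point from the comment is just the two knobs: the delay between requests and the cache that avoids requests entirely.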
Keep in mind that there's probably better technical advice out there than mine. I'm a hobbyist developer.