
python - Pass scraped URLs from one spider to another

How can I send the scraped URLs from one spider to the start_urls of another spider?

Specifically, I want to run one spider which gets a list of URLs from an XML sitemap. After the URLs have been retrieved, I want them to be used by another spider for scraping.

from scrapy.spiders import SitemapSpider

class Daily(SitemapSpider):
    name = 'daily'
    sitemap_urls = ['http://example.com/sitemap.xml']

    def parse(self, response):
        print(response.url)

        # How do I send these URLs to another spider instead?

        yield {
            'url': response.url
        }


1 Reply

From the first spider you can save the URLs in a database, or send them to a queue (ZeroMQ, RabbitMQ, Redis), for example via an item pipeline.
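For illustration, here is a minimal sketch of such a pipeline using the redis-py client. The key name 'daily:urls' is an arbitrary choice of mine, not anything Scrapy or Redis defines:

import redis

class UrlQueuePipeline:
    """Pushes each scraped URL onto a Redis list for another spider to consume."""

    def open_spider(self, spider):
        # Connection settings are assumptions; point these at your own Redis.
        self.client = redis.Redis(host='localhost', port=6379, db=0)

    def process_item(self, item, spider):
        # 'daily:urls' is an arbitrary key name that the second spider will read.
        self.client.rpush('daily:urls', item['url'])
        return item

You would enable it in settings.py with something like ITEM_PIPELINES = {'myproject.pipelines.UrlQueuePipeline': 300} (module path and priority are up to your project layout).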

The second spider can then pick the URLs up in its start_requests method:

import scrapy

class MySpider(scrapy.Spider):
    name = 'myspider'

    def start_requests(self):
        # my_db is a placeholder for whatever storage the first spider wrote to.
        urls = my_db.orm.get('urls')
        for url in urls:
            yield scrapy.Request(url, callback=self.parse)
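If the URLs went into the Redis list from the pipeline sketch above, the consuming side might look like this (same assumed key name and connection settings; the spider name is also mine):

import redis
import scrapy

class DetailSpider(scrapy.Spider):
    name = 'detail'

    def start_requests(self):
        # Pops the URLs the pipeline sketch pushed; key name is an assumption.
        client = redis.Redis(host='localhost', port=6379, db=0)
        while True:
            url = client.lpop('daily:urls')
            if url is None:
                break
            # redis-py returns bytes, so decode before building the request.
            yield scrapy.Request(url.decode('utf-8'), callback=self.parse)

    def parse(self, response):
        self.logger.info('Scraping %s', response.url)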

Alternatively, the URLs can be passed to the spider from the queue broker via the command line or an API, or the broker can launch the spider itself and let the spider fetch its URLs in start_requests, as sketched below.
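For the command-line route, Scrapy's -a spider arguments work well. This sketch (spider name and comma-separated argument format are my own choices) turns an argument into start_urls:

import scrapy

class UrlListSpider(scrapy.Spider):
    name = 'urllist'

    def __init__(self, urls='', *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Spider arguments arrive as strings, so split a comma-separated list.
        self.start_urls = [u for u in urls.split(',') if u]

    def parse(self, response):
        self.logger.info('Scraping %s', response.url)

A broker (or the first spider's process) could then launch it with:

scrapy crawl urllist -a urls="http://example.com/a,http://example.com/b"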

There are many ways to do this; which one fits depends on why you need to pass URLs from one spider to another.

You can also check out these projects: Scrapy-Cluster and Scrapy-Redis. They may be what you are looking for.
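With Scrapy-Redis, for example, the whole handoff collapses to subclassing its RedisSpider: the spider blocks on a Redis key and you feed URLs into it from anywhere. A minimal sketch, assuming the library's documented RedisSpider class and redis_key attribute (verify the key naming against its docs):

from scrapy_redis.spiders import RedisSpider

class ConsumerSpider(RedisSpider):
    name = 'consumer'
    # The spider reads its start URLs from this Redis key as they arrive.
    redis_key = 'consumer:start_urls'

    def parse(self, response):
        self.logger.info('Scraping %s', response.url)

The first spider (or redis-cli) then feeds it with LPUSH consumer:start_urls http://example.com/page, and the consumer picks the URL up without either spider knowing about the other.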
