
python - Export csv file from scrapy (not via command line)

I can successfully export my items to a CSV file from the command line, like this:

    scrapy crawl spiderName -o filename.csv


My question is: what is the easiest way to do the same from within the code? I need this because I extract the filename from another file. The end scenario should be that I call

    scrapy crawl spiderName

and it writes the items into filename.csv.

1 Reply


Why not use an item pipeline?

WriteToCsv.py

    import csv

    from YOUR_PROJECT_NAME_HERE import settings


    def write_to_csv(item):
        # Append one row per scraped item to the CSV file configured in settings
        with open(settings.csv_file_path, 'a', newline='') as f:
            writer = csv.writer(f, lineterminator='\n')
            writer.writerow([item[key] for key in item.keys()])


    class WriteToCsv(object):
        def process_item(self, item, spider):
            write_to_csv(item)
            return item

settings.py

    ITEM_PIPELINES = {'project.pipelines_path.WriteToCsv.WriteToCsv': A_NUMBER_HIGHER_THAN_ALL_OTHER_PIPELINES}
    csv_file_path = PATH_TO_CSV
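
Since the question mentions that the filename comes from another file, the path in settings.py could itself be read at import time. A minimal sketch, assuming a hypothetical output_path.txt that contains the target filename:

    # settings.py (sketch) -- output_path.txt is a hypothetical file holding the CSV filename
    from pathlib import Path

    csv_file_path = Path('output_path.txt').read_text().strip()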

If you want items written to separate CSV files for separate spiders, you could give each spider a CSV_PATH field and then, in your pipeline, use the spider's field instead of the path from settings, as in the sketch below.
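
A minimal sketch of that per-spider variant (WriteToCsvPerSpider and the CSV_PATH attribute are illustrative names, not part of Scrapy itself):

    # pipelines.py (sketch) -- per-spider output path with a project-wide fallback
    import csv

    from YOUR_PROJECT_NAME_HERE import settings


    class WriteToCsvPerSpider(object):
        def process_item(self, item, spider):
            # Use the spider's own CSV_PATH if it defines one, else the setting
            path = getattr(spider, 'CSV_PATH', settings.csv_file_path)
            with open(path, 'a', newline='') as f:
                csv.writer(f, lineterminator='\n').writerow([item[key] for key in item.keys()])
            return item


    # spiderName.py (sketch) -- the spider just declares where its rows should go
    import scrapy


    class SpiderNameSpider(scrapy.Spider):
        name = 'spiderName'
        CSV_PATH = 'filename.csv'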

This works; I tested it in my project.

HTH

http://doc.scrapy.org/en/latest/topics/item-pipeline.html

