Python 标准库参考指南 2.7.18
Excerpt (table of contents): … 5.4 Numeric Types: int, float, long, complex · 5.5 Iterator Types · … file writer · 18.9 mimify: MIME processing of mail messages · 18.10 multifile · … · 37.7 autoGIL: Global Interpreter Lock handling in event loops · 37.8 Mac OS Toolbox Modules …
1552 pages | 7.42 MB | 9 months ago
Scrapy 2.6 Documentation
Excerpt: Scrapy extracts structured data that can be used for a wide range of applications, such as data mining, information processing, or historical archival; although originally designed for web scraping, it can also be used more broadly. For simple projects (like the one in the tutorial), the defaults should be enough, but if you want to perform more complex things with the scraped items, you can write an Item Pipeline.
384 pages | 1.63 MB | 1 year ago
Scrapy 2.7 Documentation
401 pages | 1.67 MB | 1 year ago
Scrapy 2.9 Documentation
409 pages | 1.70 MB | 1 year ago
Scrapy 2.8 Documentation
405 pages | 1.69 MB | 1 year ago
Scrapy 2.10 Documentation
419 pages | 1.73 MB | 1 year ago
Celery 2.0 Documentation
Excerpt: calling a task with delay() returns an AsyncResult:
>>> result = add.delay(4, 4)
>>> result.ready()  # returns True if the task has finished processing
False
>>> result.result   # task is not ready, so no return value yet
None
>>> result.get()    # waits until the task is done and returns its value
Tasks should be short enough that they don't block the worker from processing other waiting tasks, but there is a limit: sending messages takes processing power and bandwidth, and if your tasks are so short that the messaging overhead dominates, you should reconsider your strategy; there is no universal answer here. Data locality: the worker processing the task should be as close to the data as possible, ideally with a copy in memory.
165 pages | 492.43 KB | 1 year ago
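The delay()/ready()/get() calls in the Celery excerpt above follow the familiar future/promise pattern. As a rough standard-library analogy (this uses concurrent.futures, not Celery itself; `add` here is a plain local function rather than a registered Celery task), the same flow looks like:

```python
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    return x + y

with ThreadPoolExecutor(max_workers=1) as pool:
    # submit() returns a Future, much as delay() returns an AsyncResult
    future = pool.submit(add, 4, 4)
    # done() parallels AsyncResult.ready(); may still be False while running
    finished = future.done()
    # result() blocks until the value is available, like AsyncResult.get()
    value = future.result()

print(value)  # prints 8
```

The key difference is that Celery dispatches the call through a message broker to a separate worker process, while a Future runs in a local thread pool.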
Scrapy 2.11.1 Documentation
425 pages | 1.79 MB | 1 year ago
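The Scrapy excerpt above notes that for more complex processing of scraped items you can write an Item Pipeline. As a sketch of the idea in plain Python (a pipeline component is simply a class with a `process_item(item, spider)` method; the `PriceToFloatPipeline` name and the item fields are invented for illustration, `DropItem` is stubbed so the sketch is self-contained, and a real pipeline would also be enabled via the project's ITEM_PIPELINES setting):

```python
# Stand-in for scrapy.exceptions.DropItem so this sketch runs without Scrapy;
# in a real project you would import it from Scrapy instead.
class DropItem(Exception):
    pass

class PriceToFloatPipeline:
    """Illustrative pipeline: normalize a 'price' field, drop items without one."""

    def process_item(self, item, spider):
        if not item.get("price"):
            # Raising DropItem tells Scrapy to discard the item
            raise DropItem("missing price")
        item["price"] = float(str(item["price"]).lstrip("$"))
        return item

pipeline = PriceToFloatPipeline()
print(pipeline.process_item({"name": "book", "price": "$9.99"}, spider=None))
```

Scrapy calls `process_item` on every enabled pipeline in order, passing each item through the chain, so pipelines compose naturally for cleaning, validation, and storage steps.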
527 results in total