scrapy / scrapy
Showing 1 of 2 files from the diff.

@@ -3,7 +3,7 @@
 import functools
 import logging
 from collections import defaultdict
-from twisted.internet.defer import Deferred, DeferredList
+from twisted.internet.defer import Deferred, DeferredList, _DefGen_Return
 from twisted.python.failure import Failure
 
 from scrapy.settings import Settings
@@ -139,6 +139,30 @@
             result.cleanFailure()
             result.frames = []
             result.stack = None
+
+            # This code fixes a memory leak by not keeping references to
+            # the Request and Response objects on the Media Pipeline cache.
+            #
+            # Twisted inline callbacks pass return values using the function
+            # twisted.internet.defer.returnValue, which encapsulates the return
+            # value inside a _DefGen_Return base exception.
+            #
+            # When the media_downloaded callback raises another exception,
+            # for example a FileException('download-error') because the
+            # Response status code is not 200 OK, Python stores the
+            # _DefGen_Return exception on the FileException context.
+            #
+            # To avoid keeping references to the Response and therefore Request
+            # objects on the Media Pipeline cache, we should wipe the context of
+            # the exception encapsulated by the Twisted Failure when it's a
+            # _DefGen_Return instance.
+            #
+            # This problem does not occur in Python 2.7, since it does not have
+            # exception chaining (https://www.python.org/dev/peps/pep-3134/).
+            context = getattr(result.value, '__context__', None)
+            if isinstance(context, _DefGen_Return):
+                setattr(result.value, '__context__', None)
+
         info.downloading.remove(fp)
         info.downloaded[fp] = result  # cache result
         for wad in info.waiting.pop(fp):
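
The comment above describes two interacting mechanisms: Twisted's returnValue(), which raises a _DefGen_Return carrying the Response, and PEP 3134 exception chaining, which attaches that _DefGen_Return to any exception raised while it is being handled. The standalone sketch below illustrates why the cached exception keeps the Response alive and why wiping __context__ releases it. It is not Scrapy or Twisted code: FakeResponse, FileException and DefGenReturn are illustrative stand-ins for scrapy's Response, scrapy.pipelines.files.FileException and twisted.internet.defer._DefGen_Return.

import gc
import weakref


class FakeResponse:
    """Stands in for the scrapy Response object that the leak kept alive."""


class FileException(Exception):
    """Stands in for scrapy.pipelines.files.FileException."""


class DefGenReturn(BaseException):
    """Simplified stand-in for twisted.internet.defer._DefGen_Return:
    a BaseException used only to carry a return value out of an
    inlineCallbacks generator (what returnValue() raises)."""
    def __init__(self, value):
        self.value = value


def media_downloaded():
    """Raise FileException while a DefGenReturn is being handled, so the
    latter is chained onto FileException.__context__ (PEP 3134)."""
    response = FakeResponse()
    try:
        raise DefGenReturn(response)           # roughly what returnValue() does
    except DefGenReturn:
        raise FileException('download-error')  # e.g. non-200 response status


def demo():
    try:
        media_downloaded()
    except FileException as exc:
        cached = exc   # stands in for the Failure cached in info.downloaded[fp]

    response_ref = weakref.ref(cached.__context__.value)

    # cleanFailure()/frames/stack in the surrounding code already drop the
    # traceback; mimic that first, and note the Response is *still* alive:
    cached.__traceback__ = None
    gc.collect()
    print('leaked while cached:', response_ref() is not None)            # True

    # The single line the patch adds: wipe the chained _DefGen_Return.
    cached.__context__ = None
    gc.collect()
    print('released after wiping __context__:', response_ref() is None)  # True


demo()

In the real pipeline the patch does the same thing to the exception wrapped by the Twisted Failure (result.value), right after cleanFailure() has already dropped the frames and stack.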
Files                        Coverage
scrapy                       85.46%
Project Totals (169 files)   85.46%
Build    TRAVIS_PYTHON_VERSION  TRAVIS_OS_NAME  TOXENV
6532.1   2.7                    linux           py27
6532.2   2.7                    linux           jessie
6532.5   3.4                    linux           py34
6532.6   3.5                    linux           py35
6532.8   3.7                    linux           py37
comment:
  layout: "header, diff, tree"

coverage:
  status:
    project: false