Chromium Code Reviews

Side by Side Diff: tools/findit/crash_utils.py

Issue 465403004: [Findit] Support sending cookies for http requests. (Closed) Base URL: svn://svn.chromium.org/chrome/trunk/src
Patch Set: Rebase. Created 6 years, 4 months ago
--- tools/findit/crash_utils.py (old)
+++ tools/findit/crash_utils.py (new)
 # Copyright (c) 2014 The Chromium Authors. All rights reserved.
 # Use of this source code is governed by a BSD-style license that can be
 # found in the LICENSE file.

 import cgi
 import ConfigParser
 import json
 import logging
 import os
 import time
 import urllib2

+from common import utils
 from result import Result


 INFINITY = float('inf')


 def ParseURLsFromConfig(file_name):
   """Parses URLS from the config file.

   The file should be in python config format, where svn section is in the
(...skipping 165 matching lines...)

   Args:
     url: URL to get data from.
     retries: Number of times to retry connection.
     sleep_time: Time in seconds to wait before retrying connection.
     timeout: Time in seconds to wait before time out.

   Returns:
     None if the data retrieval fails, or the raw data.
   """
-  data = None
-  for i in range(retries):
+  count = 0
+  while True:
+    count += 1
     # Retrieves data from URL.
     try:
-      data = urllib2.urlopen(url, timeout=timeout)
-
-      # If retrieval is successful, return the data.
-      if data:
-        return data.read()
-
-    # If retrieval fails, try after sleep_time second.
-    except urllib2.URLError:
-      time.sleep(sleep_time)
-      continue
+      _, data = utils.GetHttpClient().Get(url)
+      return data
     except IOError:
-      time.sleep(sleep_time)
-      continue
+      if count < retries:
+        # If retrieval fails, try after sleep_time second.
+        time.sleep(sleep_time)
+      else:
+        break

   # Return None if it fails to read data from URL 'retries' times.
   return None

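The hunk above swaps the direct `urllib2.urlopen` call for Findit's HTTP-client wrapper while keeping the retry-on-IOError behavior. A minimal standalone sketch of that retry pattern, with a plain `http_get` callable standing in for `utils.GetHttpClient().Get` (which is internal to Findit and assumed here to return a `(status, data)` tuple):

```python
import time


def get_data_from_url(url, http_get, retries=10, sleep_time=0.1):
    """Fetches url via http_get, retrying on IOError up to `retries` times.

    http_get is a hypothetical stand-in for utils.GetHttpClient().Get and
    must return a (status, data) tuple. Returns the data on success, or
    None once all attempts have failed.
    """
    count = 0
    while True:
        count += 1
        try:
            _, data = http_get(url)
            return data
        except IOError:
            if count < retries:
                # Back off before the next attempt.
                time.sleep(sleep_time)
            else:
                break
    # All `retries` attempts raised IOError.
    return None
```

Note that, as in the patched code, the loop makes exactly `retries` attempts: the counter is incremented before the request, and the loop breaks once `count` reaches `retries`.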
 def FindMinLineDistance(crashed_line_list, changed_line_numbers):
   """Calculates how far the changed line is from one of the crashes.

   Finds the minimum distance between the lines that the file crashed on
   and the lines that the file changed. For example, if the file crashed on
(...skipping 214 matching lines...)
     # Blame object does not have review url and reviewers.
     review_url = None
     reviewers = None
     line_content = blame.line_content

     result = Result(suspected_cl, revision_url, component_name, author, reason,
                     review_url, reviewers, line_content)
     result_list.append(result)

   return result_list
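The `FindMinLineDistance` docstring visible in the context lines describes taking the minimum distance between crashed lines and changed lines. A rough sketch of that computation (an illustration of the described behavior, not Findit's actual implementation, which is elided from this diff):

```python
def find_min_line_distance(crashed_line_list, changed_line_numbers):
    """Returns the minimum absolute distance between any line the file
    crashed on and any line the change touched.

    Returns infinity when either list is empty, matching the module-level
    INFINITY sentinel used as "no distance found".
    """
    min_distance = float('inf')
    for crashed_line in crashed_line_list:
        for changed_line in changed_line_numbers:
            min_distance = min(min_distance, abs(crashed_line - changed_line))
    return min_distance
```

For example, if the file crashed on lines 42 and 50 and the change touched lines 44 and 48, the minimum distance is 2.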
This is Rietveld 408576698