--- python3.4-3.4.1.orig/debian/2to3-3.1
+++ python3.4-3.4.1/debian/2to3-3.1
@@ -0,0 +1,41 @@
+.\" DO NOT MODIFY THIS FILE! It was generated by help2man 1.40.4.
+.TH 2TO3-3.3 "1" "January 2012" "2to3-3.3 3.3" "User Commands"
+.SH NAME
+2to3-3.3 \- Python2 to Python3 converter
+.SH SYNOPSIS
+.B 2to3
+[\fIoptions\fR] \fIfile|dir \fR...
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-d\fR, \fB\-\-doctests_only\fR
+Fix up doctests only
+.TP
+\fB\-f\fR FIX, \fB\-\-fix\fR=\fIFIX\fR
+Each FIX specifies a transformation; default: all
+.TP
+\fB\-j\fR PROCESSES, \fB\-\-processes\fR=\fIPROCESSES\fR
+Run 2to3 concurrently
+.TP
+\fB\-x\fR NOFIX, \fB\-\-nofix\fR=\fINOFIX\fR
+Prevent a transformation from being run
+.TP
+\fB\-l\fR, \fB\-\-list\-fixes\fR
+List available transformations
+.TP
+\fB\-p\fR, \fB\-\-print\-function\fR
+Modify the grammar so that print() is a function
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+More verbose logging
+.TP
+\fB\-\-no\-diffs\fR
+Don't show diffs of the refactoring
+.TP
+\fB\-w\fR, \fB\-\-write\fR
+Write back modified files
+.TP
+\fB\-n\fR, \fB\-\-nobackups\fR
+Don't write backups for modified files
--- python3.4-3.4.1.orig/debian/FAQ.html
+++ python3.4-3.4.1/debian/FAQ.html
@@ -0,0 +1,8997 @@
+
+
+The Whole Python FAQ
+
+
+
+

The Whole Python FAQ

+Last changed on Wed Feb 12 21:31:08 2003 CET + +

(Entries marked with ** were changed within the last 24 hours; +entries marked with * were changed within the last 7 days.) +

+
1. General information and availability
2. Python in the real world
3. Building Python and Other Known Bugs
4. Programming in Python
5. Extending Python
6. Python's design
7. Using Python on non-UNIX platforms
8. Python on Windows
+

1. General information and availability

+ +
+

1.1. What is Python?

+Python is an interpreted, interactive, object-oriented programming +language. It incorporates modules, exceptions, dynamic typing, very +high level dynamic data types, and classes. Python combines +remarkable power with very clear syntax. It has interfaces to many +system calls and libraries, as well as to various window systems, and +is extensible in C or C++. It is also usable as an extension language +for applications that need a programmable interface. Finally, Python +is portable: it runs on many brands of UNIX, on the Mac, and on PCs +under MS-DOS, Windows, Windows NT, and OS/2. +

+To find out more, the best thing to do is to start reading the +tutorial from the documentation set (see a few questions further +down). +

+See also question 1.17 (what is Python good for). +

+ +Edit this entry / +Log info + +/ Last changed on Mon May 26 16:05:18 1997 by +GvR +

+ +


+

1.2. Why is it called Python?

+Apart from being a computer scientist, I'm also a fan of "Monty +Python's Flying Circus" (a BBC comedy series from the seventies, in +the -- unlikely -- case you didn't know). It occurred to me one day +that I needed a name that was short, unique, and slightly mysterious. +And I happened to be reading some scripts from the series at the +time... So then I decided to call my language Python. +

+By now I don't care any more whether you use a Python, some other +snake, a foot or 16-ton weight, or a wood rat as a logo for Python! +

+ +Edit this entry / +Log info + +/ Last changed on Thu Aug 24 00:50:41 2000 by +GvR +

+ +


+

1.3. How do I obtain a copy of the Python source?

+The latest Python source distribution is always available from +python.org, at http://www.python.org/download. The latest development sources can be obtained via anonymous CVS from SourceForge, at http://www.sf.net/projects/python . +

+The source distribution is a gzipped tar file containing the complete C source, LaTeX +documentation, Python library modules, example programs, and several +useful pieces of freely distributable software. This will compile and +run out of the box on most UNIX platforms. (See section 7 for +non-UNIX information.) +

+Older versions of Python are also available from python.org. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Apr 9 17:06:16 2002 by +A.M. Kuchling +

+ +


+

1.4. How do I get documentation on Python?

+All documentation is available on-line, starting at http://www.python.org/doc/. +

+The LaTeX source for the documentation is part of the source +distribution. If you don't have LaTeX, the latest Python +documentation set is available, in various formats like postscript +and html, by anonymous ftp - visit the above URL for links to the +current versions. +

+A high-level description of Python is available in PostScript form in the file nluug-paper.ps
+(a separate file on the ftp site).

+ +Edit this entry / +Log info + +/ Last changed on Wed Jan 21 12:02:55 1998 by +Ken Manheimer +

+ +


+

1.5. Are there other ftp sites that mirror the Python distribution?

+The following anonymous ftp sites keep mirrors of the Python +distribution: +

+USA: +

+

+        ftp://ftp.python.org/pub/python/
+        ftp://gatekeeper.dec.com/pub/plan/python/
+        ftp://ftp.uu.net/languages/python/
+        ftp://ftp.wustl.edu/graphics/graphics/sgi-stuff/python/
+        ftp://ftp.sterling.com/programming/languages/python/
+        ftp://uiarchive.cso.uiuc.edu/pub/lang/python/
+        ftp://ftp.pht.com/mirrors/python/python/
+	ftp://ftp.cdrom.com/pub/python/
+
+Europe: +

+

+        ftp://ftp.cwi.nl/pub/python/
+        ftp://ftp.funet.fi/pub/languages/python/
+        ftp://ftp.sunet.se/pub/lang/python/
+        ftp://unix.hensa.ac.uk/mirrors/uunet/languages/python/
+        ftp://ftp.lip6.fr/pub/python/
+        ftp://sunsite.cnlab-switch.ch/mirror/python/
+        ftp://ftp.informatik.tu-muenchen.de/pub/comp/programming/languages/python/
+
+Australia: +

+

+        ftp://ftp.dstc.edu.au/pub/python/
+
+

+ +Edit this entry / +Log info + +/ Last changed on Wed Mar 24 09:20:49 1999 by +A.M. Kuchling +

+ +


+

1.6. Is there a newsgroup or mailing list devoted to Python?

+There is a newsgroup, comp.lang.python, +and a mailing list. The newsgroup and mailing list are gatewayed into +each other -- if you can read news it's unnecessary to subscribe to +the mailing list. To subscribe to the mailing list +(python-list@python.org) visit its Mailman webpage at +http://www.python.org/mailman/listinfo/python-list +

+More info about the newsgroup and mailing list, and about other lists, +can be found at +http://www.python.org/psa/MailingLists.html. +

+Archives of the newsgroup are kept by Deja News and accessible
+through the "Python newsgroup search" web page,
+http://www.python.org/search/search_news.html.
+This page also contains pointers to other archival collections.

+ +Edit this entry / +Log info + +/ Last changed on Wed Jun 23 09:29:36 1999 by +GvR +

+ +


+

1.7. Is there a WWW page devoted to Python?

+Yes, http://www.python.org/ is the official Python home page. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 23 14:42:59 1997 by +Ken Manheimer +

+ +


+

1.8. Is the Python documentation available on the WWW?

+Yes. Python 2.0 documentation is available from +http://www.pythonlabs.com/tech/python2.0/doc/ and from +http://www.python.org/doc/. Note that most documentation +is available for on-line browsing as well as for downloading. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 2 03:14:08 2001 by +Moshe Zadka +

+ +


+

1.9. Are there any books on Python?

+Yes, many, and more are being published. See +the python.org Wiki at http://www.python.org/cgi-bin/moinmoin/PythonBooks for a list. +

+You can also search online bookstores for "Python" +(and filter out the Monty Python references; or +perhaps search for "Python" and "language"). +

+ +Edit this entry / +Log info + +/ Last changed on Mon Aug 5 19:08:49 2002 by +amk +

+ +


+

1.10. Are there any published articles about Python that I can reference?

+If you can't reference the web site, and you don't want to reference the books +(see previous question), there are several articles on Python that you could +reference. +

+Most publications about Python are collected on the Python web site: +

+

+    http://www.python.org/doc/Publications.html
+
+It is no longer recommended to reference this +very old article by Python's author: +

+

+    Guido van Rossum and Jelke de Boer, "Interactively Testing Remote
+    Servers Using the Python Programming Language", CWI Quarterly, Volume
+    4, Issue 4 (December 1991), Amsterdam, pp 283-303.
+
+

+ +Edit this entry / +Log info + +/ Last changed on Sat Jul 4 20:52:31 1998 by +GvR +

+ +


+

1.11. Are there short introductory papers or talks on Python?

+There are several - you can find links to some of them collected at +http://www.python.org/doc/Hints.html#intros. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 23 15:04:05 1997 by +Ken Manheimer +

+ +


+

1.12. How does the Python version numbering scheme work?

+Python versions are numbered A.B.C or A.B. A is the major version +number -- it is only incremented for really major changes in the +language. B is the minor version number, incremented for less +earth-shattering changes. C is the micro-level -- it is +incremented for each bugfix release. See PEP 6 for more information +about bugfix releases. +

+Not all releases have bugfix releases. +Note that in the past (ending with 1.5.2), +micro releases have added significant changes; +in fact the changeover from 0.9.9 to 1.0.0 was the first time +that either A or B changed! +

+Alpha, beta and release candidate versions have an additional suffix.
+The suffix for an alpha version is "aN" for some small number N, the
+suffix for a beta version is "bN" for some small number N, and the
+suffix for a release candidate version is "cN" for some small number N.

+Note that (for instance) all versions labeled 2.0aN precede the +versions labeled 2.0bN, which precede versions labeled 2.0cN, and +those precede 2.0. +

+As a rule, no changes are made between release candidates and the final +release unless there are show-stopper bugs. +

+You may also find version numbers with a "+" suffix, e.g. "2.2+". +These are unreleased versions, built directly from the CVS trunk. +

+See also the documentation for sys.version, sys.hexversion, and +sys.version_info. +
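+For example (a minimal sketch; the version threshold below is made up for
+illustration, not something this FAQ prescribes), a script can compare
+sys.version_info against a tuple to check which interpreter it is running under:
+
+    import sys
+
+    # sys.version_info is a tuple such as (2, 2, 1, 'final', 0):
+    # (major, minor, micro, releaselevel, serial)
+    if sys.version_info < (2, 2):
+        print "This example pretends to require Python 2.2 or newer"
+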

+ +Edit this entry / +Log info + +/ Last changed on Mon Jan 14 06:34:17 2002 by +GvR +

+ +


+

1.13. How do I get a beta test version of Python?

+All releases, including alphas, betas and release candidates, are announced on
+the comp.lang.python and comp.lang.python.announce newsgroups,
+which are gatewayed into the python-list@python.org and
+python-announce@python.org mailing lists. In addition, all these announcements appear on
+the Python home page, at http://www.python.org.

+You can also access the development version of Python through CVS. See http://sourceforge.net/cvs/?group_id=5470 for details. If you're not familiar with CVS, documents like http://linux.oreillynet.com/pub/a/linux/2002/01/03/cvs_intro.html +provide an introduction. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 00:57:08 2002 by +Neal Norwitz +

+ +


+

1.14. Are there copyright restrictions on the use of Python?

+Hardly. You can do anything you want with the source, as long as +you leave the copyrights in, and display those copyrights in any +documentation about Python that you produce. Also, don't use the +author's institute's name in publicity without prior written +permission, and don't hold them responsible for anything (read the +actual copyright for a precise legal wording). +

+In particular, if you honor the copyright rules, it's OK to use Python +for commercial use, to sell copies of Python in source or binary form, +or to sell products that enhance Python or incorporate Python (or part +of it) in some form. I would still like to know about all commercial +use of Python! +

+ +Edit this entry / +Log info +

+ +


+

1.15. Why was Python created in the first place?

+Here's a very brief summary of what got me started: +

+I had extensive experience with implementing an interpreted language +in the ABC group at CWI, and from working with this group I had +learned a lot about language design. This is the origin of many +Python features, including the use of indentation for statement +grouping and the inclusion of very-high-level data types (although the +details are all different in Python). +

+I had a number of gripes about the ABC language, but also liked many +of its features. It was impossible to extend the ABC language (or its +implementation) to remedy my complaints -- in fact its lack of +extensibility was one of its biggest problems. +I had some experience with using Modula-2+ and talked with the +designers of Modula-3 (and read the M3 report). M3 is the origin of +the syntax and semantics used for exceptions, and some other Python +features. +

+I was working in the Amoeba distributed operating system group at +CWI. We needed a better way to do system administration than by +writing either C programs or Bourne shell scripts, since Amoeba had +its own system call interface which wasn't easily accessible from the +Bourne shell. My experience with error handling in Amoeba made me +acutely aware of the importance of exceptions as a programming +language feature. +

+It occurred to me that a scripting language with a syntax like ABC +but with access to the Amoeba system calls would fill the need. I +realized that it would be foolish to write an Amoeba-specific +language, so I decided that I needed a language that was generally +extensible. +

+During the 1989 Christmas holidays, I had a lot of time on my hands,
+so I decided to give it a try. During the next year, while still
+mostly working on it in my own time, Python was used in the Amoeba
+project with increasing success, and the feedback from colleagues made
+me add many early improvements.

+In February 1991, after just over a year of development, I decided +to post to USENET. The rest is in the Misc/HISTORY file. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 23 00:06:23 1997 by +GvR +

+ +


+

1.16. Do I have to like "Monty Python's Flying Circus"?

+No, but it helps. Pythonistas like the occasional reference to SPAM,
+and of course, nobody expects the Spanish Inquisition.

+The two main reasons to use Python are: +

+

+ - Portable
+ - Easy to learn
+
+The three main reasons to use Python are: +

+

+ - Portable
+ - Easy to learn
+ - Powerful standard library
+
+(And nice red uniforms.) +

+And remember, there is no rule six. +

+ +Edit this entry / +Log info + +/ Last changed on Wed May 28 10:39:21 1997 by +GvR +

+ +


+

1.17. What is Python good for?

+Python is used in many situations where a great deal of dynamism, +ease of use, power, and flexibility are required. +

+For basic text manipulation, core Python (without any non-core extensions) is
+easy to use and roughly as fast as just about any language. This makes Python
+good for many system-administration tasks, for CGI programming,
+and for other application areas that manipulate text and strings.

+When augmented with +standard extensions (such as PIL, COM, Numeric, oracledb, kjbuckets, +tkinter, win32api, etc.) +or special purpose extensions (that you write, perhaps using helper tools such +as SWIG, or using object protocols such as ILU/CORBA or COM) Python +becomes a very convenient "glue" or "steering" +language that helps make heterogeneous collections of unrelated +software packages work together. +For example by combining Numeric with oracledb you can help your +SQL database do statistical analysis, or even Fourier transforms. +One of the features that makes Python excel in the "glue language" role +is Python's simple, usable, and powerful C language runtime API. +

+Many developers also use Python extensively as a graphical user
+interface development aid.

+ +Edit this entry / +Log info + +/ Last changed on Sat May 24 10:13:11 1997 by +Aaron Watters +

+ +


+

1.18. Can I use the FAQ Wizard software to maintain my own FAQ?

+Sure. It's in Tools/faqwiz/ of the python source tree. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Mar 29 06:50:32 2002 by +Aahz +

+ +


+

1.19. Which editor has good support for editing Python source code?

+On Unix, the first choice is Emacs/XEmacs. There's an elaborate +mode for editing Python code, which is available from the Python +source distribution (Misc/python-mode.el). It's also bundled +with XEmacs (we're still working on legal details to make it possible +to bundle it with FSF Emacs). And it has its own web page: +

+

+    http://www.python.org/emacs/python-mode/index.html
+
+There are many other choices, for Unix, Windows or Macintosh. +Richard Jones compiled a table from postings on the Python newsgroup: +

+

+    http://www.bofh.asn.au/~richard/editors.html
+
+See also FAQ question 7.10 for some more Mac and Win options. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 15 23:21:04 1998 by +Gvr +

+ +


+

1.20. I've never programmed before. Is there a Python tutorial?

+There are several, and at least one book. +All information for beginning Python programmers is collected here: +

+

+    http://www.python.org/doc/Newbies.html
+
+

+ +Edit this entry / +Log info + +/ Last changed on Wed Sep 5 05:34:07 2001 by +GvR +

+ +


+

1.21. Where in the world is www.python.org located?

+It's currently in Amsterdam, graciously hosted by XS4ALL: +

+

+    http://www.xs4all.nl
+
+Thanks to Thomas Wouters for setting this up!!!! +

+ +Edit this entry / +Log info + +/ Last changed on Fri Aug 3 21:49:27 2001 by +GvR +

+ +


+

2. Python in the real world

+ +
+

2.1. How many people are using Python?

+Certainly thousands, and quite probably tens of thousands of users. +More are seeing the light each day. The comp.lang.python newsgroup is +very active, but overall there is no accurate estimate of the number of subscribers or Python users. +

+Jacek Artymiak has created a Python Users Counter; you can see the +current count by visiting +http://www.wszechnica.safenet.pl/cgi-bin/checkpythonuserscounter.py +(this will not increment the counter; use the link there if you haven't +added yourself already). Most Python users appear not to have registered themselves. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Feb 21 23:29:18 2002 by +GvR +

+ +


+

2.2. Have any significant projects been done in Python?

+At CWI (the former home of Python), we have written a 20,000 line +authoring environment for transportable hypermedia presentations, a +5,000 line multimedia teleconferencing tool, as well as many many +smaller programs. +

+At CNRI (Python's new home), we have written two large applications: +Grail, a fully featured web browser (see +http://grail.cnri.reston.va.us), +and the Knowbot Operating Environment, +a distributed environment for mobile code. +

+The University of Virginia uses Python to control a virtual reality +engine. See http://alice.cs.cmu.edu. +

+The ILU project at Xerox PARC can generate Python glue for ILU +interfaces. See ftp://ftp.parc.xerox.com/pub/ilu/ilu.html. ILU +is a free CORBA compliant ORB which supplies distributed object +connectivity to a host of platforms using a host of languages. +

+Mark Hammond and Greg Stein and others are interfacing Python to
+Microsoft's COM and ActiveX architectures. This means, among other
+things, that Python may be used in active server pages or as a COM
+controller (for example to automatically extract from or insert information
+into Excel or MSAccess or any other COM-aware application).
+Mark claims Python can even be an ActiveX scripting host (which
+means you could embed JScript inside a Python application, if you
+had a strange sense of humor). Python/AX/COM is distributed as part
+of the PythonWin distribution.

+The University of California, Irvine uses a student administration +system called TELE-Vision written entirely in Python. Contact: Ray +Price rlprice@uci.edu. +

+The Melbourne Cricket Ground (MCG) in Australia (a 100,000+ person venue)
+has its scoreboard system written largely in Python on MS Windows.
+Python expressions are used to create almost every scoring entry that
+appears on the board. The move to Python/C++ away from exclusive C++
+has provided a level of functionality that would simply not have been
+viable otherwise.

+See also the next question. +

+Note: this FAQ entry is really old. +See http://www.python.org/psa/Users.html for a more recent list. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Oct 25 13:24:15 2000 by +GvR +

+ +


+

2.3. Are there any commercial projects going on using Python?

+Yes, there's lots of commercial activity using Python. See +http://www.python.org/psa/Users.html for a list. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Oct 14 18:17:33 1998 by +ken +

+ +


+

2.4. How stable is Python?

+Very stable. New, stable releases have been coming out roughly every 3 to 12 months since 1991, and this seems likely to continue. +

+With the introduction of retrospective "bugfix" releases the stability of the language implementations can be, and is being, improved independently of the new features offered by more recent major or minor releases. Bugfix releases, indicated by a third component of the version number, only fix known problems and do not gratuitously introduce new and possibly incompatible features or modified library functionality. +

+Release 2.2 got its first bugfix on April 10, 2002. The new version +number is now 2.2.1. The 2.1 release, at 2.1.3, can probably be +considered the "most stable" platform because it has been bugfixed +twice. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jul 23 10:20:04 2002 by +Jens Kubieziel +

+ +


+

2.5. What new developments are expected for Python in the future?

+See http://www.python.org/peps/ for the Python Enhancement +Proposals (PEPs). PEPs are design +documents +describing a suggested new feature for Python, providing +a concise technical specification and a rationale. +

+Also, follow the discussions on the python-dev mailing list. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Apr 9 17:09:51 2002 by +A.M. Kuchling +

+ +


+

2.6. Is it reasonable to propose incompatible changes to Python?

+In general, no. There are already millions of lines of Python code
+around the world, so any change in the language that invalidates more
+than a very small fraction of existing programs has to be frowned
+upon. Even if you can provide a conversion program, there still is
+the problem of updating all documentation. Providing a gradual
+upgrade path is the only way if a feature has to be changed.

+See http://www.python.org/peps/pep-0005.html for the proposed
+mechanism for introducing backward-incompatible changes.

+ +Edit this entry / +Log info + +/ Last changed on Mon Apr 1 22:13:47 2002 by +Fred Drake +

+ +


+

2.7. What is the future of Python?

+Please see http://www.python.org/peps/ for proposals of future +activities. One of the PEPs (Python Enhancement Proposals) deals +with the PEP process and PEP format -- see +http://www.python.org/peps/pep-0001.html if you want to +submit a PEP. In http://www.python.org/peps/pep-0042.html there +is a list of wishlists the Python Development team plans to tackle. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Apr 1 22:15:46 2002 by +Fred Drake +

+ +


+

2.8. What was the PSA, anyway?

+The Python Software Activity was
+created by a number of Python aficionados who wanted Python to be more
+than the product and responsibility of a single individual.
+The PSA was not an independent organization, but lived
+under the umbrella of CNRI.

+The PSA has been superseded by the Python Software Foundation, +an independent non-profit organization. The PSF's home page +is at http://www.python.org/psf/. +

+Some pages created by the PSA still live at +http://www.python.org/psa/ +

+ +Edit this entry / +Log info + +/ Last changed on Thu Jul 25 18:19:44 2002 by +GvR +

+ +


+

2.9. Deleted

+

+

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 2 02:51:30 2001 by +Moshe Zadka +

+ +


+

2.10. Deleted

+

+

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 2 02:52:19 2001 by +Moshe Zadka +

+ +


+

2.11. Is Python Y2K (Year 2000) Compliant?

+As of January, 2001 no major problems have been reported and Y2K +compliance seems to be a non-issue. +

+Since Python is available free of charge, there are no absolute +guarantees. If there are unforeseen problems, liability is the +user's rather than the developers', and there is nobody you can sue for damages. +

+Python does little date manipulation, and what it does do is based on the Unix
+representation of time (even on non-Unix systems), which uses seconds
+since 1970 and won't overflow until 2038.
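+As a small illustrative sketch (not part of the original FAQ text), the time
+module exposes this representation directly:
+
+    import time
+
+    t = time.time()        # seconds since the epoch, 1970-01-01 00:00:00 UTC
+    print time.gmtime(t)   # the same instant as a broken-down UTC time tuple
+    print time.ctime(t)    # human-readable form, e.g. 'Wed Feb 12 21:31:08 2003'
+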

+ +Edit this entry / +Log info + +/ Last changed on Mon Jan 8 17:19:32 2001 by +Steve Holden +

+ +


+

2.12. Is Python a good language in a class for beginning programmers?

+Yes. This long answer attempts to address any concerns you might +have with teaching Python as a programmer's first language. +(If you want to discuss Python's use in education, then +you may be interested in joining the edu-sig mailinglist. +See http://www.python.org/sigs/edu-sig/ ) +

+It is still common to start students with a procedural +(subset of a) statically typed language such as Pascal, C, or +a subset of C++ or Java. I think that students may be better +served by learning Python as their first language. Python has +a very simple and consistent syntax and a large standard library. +Most importantly, using Python in a beginning programming course +permits students to concentrate on important programming skills, +such as problem decomposition and data type design. +

+With Python, students can be quickly introduced to basic concepts +such as loops and procedures. They can even probably work with +user-defined objects in their very first course. They could +implement a tree structure as nested Python lists, for example. +They could be introduced to objects in their first course if +desired. For a student who has never programmed before, using +a statically typed language seems unnatural. It presents +additional complexity that the student must master and slows +the pace of the course. The students are trying to learn to +think like a computer, decompose problems, design consistent +interfaces, and encapsulate data. While learning to use a +statically typed language is important, it is not necessarily the +best topic to address in the students' first programming course. +

+Many other aspects of Python make it a good first language. +Python has a large standard library (like Java) so that +students can be assigned programming projects very early in the +course that do something. Assignments aren't restricted to the +standard four-function calculator and check balancing programs. +By using the standard library, students can gain the satisfaction +of working on realistic applications as they learn the fundamentals +of programming. Using the standard library also teaches students +about code reuse. +

+Python's interactive interpreter also enables students to +test language features while they're programming. They can keep +a window with the interpreter running while they enter their +programs' source in another window. If they can't remember the +methods for a list, they can do something like this: +

+

+ >>> L = []
+ >>> dir(L)
+ ['append', 'count', 'extend', 'index', 'insert', 'pop', 'remove',
+ 'reverse', 'sort']
+ >>> print L.append.__doc__
+ L.append(object) -- append object to end
+ >>> L.append(1)
+ >>> L
+ [1]
+
+With the interpreter, documentation is never far from the +student as he's programming. +

+There are also good IDEs for Python. Guido van Rossum's IDLE +is a cross-platform IDE for Python that is written in Python +using Tk. There is also a Windows specific IDE called PythonWin. +Emacs users will be happy to know that there is a very good Python +mode for Emacs. All of these programming environments provide +syntax highlighting, auto-indenting, and access to the interactive +interpreter while coding. For more information about IDEs, see XXX. +

+If your department is currently using Pascal because it was +designed to be a teaching language, then you'll be happy to +know that Guido van Rossum designed Python to be simple to +teach to everyone but powerful enough to implement real world +applications. Python makes a good language for first time +programmers because that was one of Python's design goals. +There are papers at http://www.python.org/doc/essays/ on the Python website +by Python's creator explaining his objectives for the language. +One that may interest you is titled "Computer Programming for Everybody" +http://www.python.org/doc/essays/cp4e.html +

+If you're seriously considering Python as a language for your +school, Guido van Rossum may even be willing to correspond with +you about how the language would fit in your curriculum. +See http://www.python.org/doc/FAQ.html#2.2 for examples of +Python's use in the "real world." +

+While Python, its source code, and its IDEs are freely +available, this consideration should not rule +out other languages. There are other free languages (Java, +free C compilers), and many companies are willing to waive some +or all of their fees for student programming tools if it +guarantees that a whole graduating class will know how to +use their tools. That is, if one of the requirements for +the language that will be taught is that it be freely +available, then Python qualifies, but this requirement +does not preclude other languages. +

+While Python jobs may not be as prevalent as C/C++/Java jobs, +teachers should not worry about teaching students critical job +skills in their first course. The skills that win students a +job are those they learn in their senior classes and internships. +Their first programming courses are there to lay a solid +foundation in programming fundamentals. The primary question +in choosing the language for such a course should be which +language permits the students to learn this material without +hindering or limiting them. +

+Another argument for Python is that there are many tasks for +which something like C++ is overkill. That's where languages +like Python, Perl, Tcl, and Visual Basic thrive. It's critical +for students to know something about these languages. (Every +employer for whom I've worked used at least one such language.) +Of the languages listed above, Python probably makes the best +language in a programming curriculum since its syntax is simple, +consistent, and not unlike other languages (C/C++/Java) that +are probably in the curriculum. By starting students with +Python, a department simultaneously lays the foundations for +other programming courses and introduces students to the type +of language that is often used as a "glue" language. As an +added bonus, Python can be used to interface with Microsoft's +COM components (thanks to Mark Hammond). There is also Jython, +a Java implementation of the Python interpreter, that can be +used to connect Java components. +

+If you currently start students with Pascal or C/C++ or Java, +you may be worried they will have trouble learning a statically +typed language after starting with Python. I think that this +fear most often stems from the fact that the teacher started +with a statically typed language, and we tend to like to teach +others in the same way we were taught. In reality, the +transition from Python to one of these other languages is +quite simple. +

+To motivate a statically typed language such as C++, begin the +course by explaining that unlike Python, their first language, +C++ is compiled to a machine dependent executable. Explain +that the point is to make a very fast executable. To permit +the compiler to make optimizations, programmers must help it +by specifying the "types" of variables. By restricting each +variable to a specific type, the compiler can reduce the +book-keeping it has to do to permit dynamic types. The compiler +also has to resolve references at compile time. Thus, the +language gains speed by sacrificing some of Python's dynamic +features. Then again, the C++ compiler provides type safety +and catches many bugs at compile time instead of run time (a +critical consideration for many commercial applications). C++ +is also designed for very large programs where one may want to +guarantee that others don't touch an object's implementation. +C++ provides very strong language features to separate an object's +implementation from its interface. Explain why this separation +is a good thing. +

+The first day of a C++ course could then be a whirlwind introduction
+to what C++ requires and provides. The point here is that after
+a semester or two of Python, students are hopefully competent
+programmers. They know how to handle loops and write procedures.
+They've also worked with objects, thought about the benefits of
+consistent interfaces, and used the technique of subclassing to
+specialize behavior. Thus, a whirlwind introduction to C++ could
+show them how objects and subclassing look in C++. The
+potentially difficult concepts of object-oriented design were
+taught without the additional obstacles presented by a language
+such as C++ or Java. When learning one of these languages,
+the students would already understand the "road map." They
+understand objects; they would just be learning how objects
+fit in a statically typed language. Language requirements
+and compiler errors that seem unnatural to beginning programmers
+make sense in this new context. Many students will find it
+helpful to be able to write a fast prototype of their algorithms
+in Python. Thus, they can test and debug their ideas before
+they attempt to write the code in the new language, saving the
+effort of working with C++ types for when they've discovered a
+working solution for their assignments. When they get annoyed
+with the rigidity of types, they'll be happy to learn about
+containers and templates to regain some of the lost flexibility
+Python afforded them. Students may also gain an appreciation
+for the fact that no language is best for every task. They'll
+see that C++ is faster, but they'll know that they can gain
+flexibility and development speed with Python when execution
+speed isn't critical.

+If you have any concerns that weren't addressed here, try +posting to the Python newsgroup. Others there have done some +work with using Python as an instructional tool. Good luck. +We'd love to hear about it if you choose Python for your course. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Dec 2 19:32:35 2002 by +Bill Sconce +

+ +


+

3. Building Python and Other Known Bugs

+ +
+

3.1. Is there a test set?

+Sure. You can run it after building with "make test", or you can +run it manually with this command at the Python prompt: +

+

+ import test.autotest
+
+In Python 1.4 or earlier, use +

+

+ import autotest
+
+The test set doesn't test all features of Python, +but it goes a long way to confirm that Python is actually working. +

+NOTE: if "make test" fails, don't just mail the output to the +newsgroup -- this doesn't give enough information to debug the +problem. Instead, find out which test fails, and run that test +manually from an interactive interpreter. For example, if +"make test" reports that test_spam fails, try this interactively: +

+

+ import test.test_spam
+
+This generally produces more verbose output which can be diagnosed +to debug the problem. If you find a bug in Python or the libraries, or in the tests, please report this in the Python bug tracker at SourceForge: +

+http://sourceforge.net/tracker/?func=add&group_id=5470&atid=105470 +

+ +Edit this entry / +Log info + +/ Last changed on Fri Apr 27 10:29:36 2001 by +Fred Drake +

+ +


+

3.2. When running the test set, I get complaints about floating point operations, but when playing with floating point operations I cannot find anything wrong with them.

+The test set makes occasional unwarranted assumptions about the +semantics of C floating point operations. Until someone donates a +better floating point test set, you will have to comment out the +offending floating point tests and execute similar tests manually. +

+ +Edit this entry / +Log info +

+ +


+

3.3. Link errors after rerunning the configure script.

+It is generally necessary to run "make clean" after a configuration +change. +

+ +Edit this entry / +Log info +

+ +


+

3.4. The python interpreter complains about options passed to a script (after the script name).

+You are probably linking with GNU getopt, e.g. through -liberty. +Don't. The reason for the complaint is that GNU getopt, unlike System +V getopt and other getopt implementations, doesn't consider a +non-option to be the end of the option list. A quick (and compatible) +fix for scripts is to add "--" to the interpreter, like this: +

+

+        #! /usr/local/bin/python --
+
+You can also use this interactively: +

+

+        python -- script.py [options]
+
+Note that a working getopt implementation is provided in the Python +distribution (in Python/getopt.c) but not automatically used. +

+ +Edit this entry / +Log info +

+ +


+

3.5. When building on the SGI, make tries to run python to create glmodule.c, but python hasn't been built or installed yet.

+Comment out the line mentioning glmodule.c in Setup and build a +python without gl first; install it or make sure it is in your $PATH, +then edit the Setup file again to turn on the gl module, and make +again. You don't need to do "make clean"; you do need to run "make +Makefile" in the Modules subdirectory (or just run "make" at the +toplevel). +

+ +Edit this entry / +Log info +

+ +


+

3.6. I use VPATH but some targets are built in the source directory.

+On some systems (e.g. Sun), if the target already exists in the +source directory, it is created there instead of in the build +directory. This is usually because you have previously built without +VPATH. Try running "make clobber" in the source directory. +

+ +Edit this entry / +Log info +

+ +


+

3.7. Trouble building or linking with the GNU readline library.

+You can use the GNU readline library to improve the interactive user +interface: this gives you line editing and command history when +calling python interactively. Its sources are distributed with +Python (at least for 2.0). Uncomment the line +

+#readline readline.c -lreadline -ltermcap +

+in Modules/Setup. The configuration option --with-readline +is no longer supported, at least in Python 2.0. Some hints on +building and using the readline library: +On SGI IRIX 5, you may have to add the following +to rldefs.h: +

+

+        #ifndef sigmask
+        #define sigmask(sig) (1L << ((sig)-1))
+        #endif
+
+On some systems, you will have to add #include "rldefs.h" to the
+top of several source files, and if you use the VPATH feature, you
+will have to add dependencies of the form foo.o: foo.c to the
+Makefile for several values of foo.
+
+The readline library requires use of the termcap library. A
+known problem with this is that it contains entry points which
+cause conflicts with the STDWIN and SGI GL libraries. The STDWIN
+conflict can be solved by adding a line saying '#define werase w_erase' to the
+stdwin.h file (in the STDWIN distribution, subdirectory H). The
+GL conflict has been solved in the Python configure script by a
+hack that forces use of the static version of the termcap library.
+
+Check the newsgroup gnu.bash.bug for
+specific problems with the readline library (I don't read this group
+but I've been told that it is the place for readline bugs).

+ +Edit this entry / +Log info + +/ Last changed on Sat Dec 2 18:23:48 2000 by +Issac Trotts +

+ +


+

3.8. Trouble with socket I/O on older Linux 1.x versions.

+Once you've built Python, use it to run the regen script in the +Lib/plat-linux2 directory. Apparently the files as distributed don't match the system headers on some Linux versions. +

+Note that this FAQ entry only applies to Linux kernel versions 1.x.y; +these are hardly around any more. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jul 30 20:05:52 2002 by +Jens Kubieziel +

+ +


+

3.9. Trouble with prototypes on Ultrix.

+Ultrix cc seems broken -- use gcc, or edit config.h to #undef +HAVE_PROTOTYPES. +

+ +Edit this entry / +Log info +

+ +


+

3.10. Other trouble building Python on platform X.

+Please submit the details to the SourceForge bug tracker: +

+

+  http://sourceforge.net/tracker/?group_id=5470&atid=105470
+
+and we'll look +into it. Please provide as many details as possible. In particular, +if you don't tell us what type of computer and what operating system +(and version) you are using it will be difficult for us to figure out +what is the matter. If you have compilation output logs, +please use file uploads -- don't paste everything in the message box. +

+In many cases, we won't have access to the same hardware or operating system version, so please, if you have a SourceForge account, log in before filing your report, or if you don't have an account, include an email address at which we can reach you for further questions. Logging in to SourceForge first will also cause SourceForge to send you updates as we act on your report. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Apr 27 10:53:18 2001 by +Fred Drake +

+ +


+

3.11. How to configure dynamic loading on Linux.

+This is now automatic as long as your Linux version uses the ELF +object format (all recent Linuxes do). +

+ +Edit this entry / +Log info +

+ +


+

3.12. I can't get shared modules to work on Linux 2.0 (Slackware96)?

+This is a bug in the Slackware96 release. The fix is simple: make sure
+that there is a link from /lib/libdl.so to /lib/libdl.so.1 so that the
+following links are set up:
+
+    /lib/libdl.so -> /lib/libdl.so.1
+    /lib/libdl.so.1 -> /lib/libdl.so.1.7.14
+
+You may have to rerun the configure script, after rm'ing the
+config.cache file, before you attempt to rebuild python after this fix.

+ +Edit this entry / +Log info + +/ Last changed on Wed May 21 15:45:03 1997 by +GvR +

+ +


+

3.13. Trouble when making modules shared on Linux.

+This happens when you have built Python for static linking and then +enable +
+  *shared*
+
+in the Setup file. Shared library code must be +compiled with "-fpic". If a .o file for the module already exist that +was compiled for static linking, you must remove it or do "make clean" +in the Modules directory. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 23 13:42:30 1997 by +GvR +

+ +


+

3.14. [deleted]

+[ancient information on threads on linux (when thread support +was not standard) used to be here] +

+ +Edit this entry / +Log info + +/ Last changed on Sun Jun 2 17:27:13 2002 by +Erno Kuusela +

+ +


+

3.15. Errors when linking with a shared library containing C++ code.

+Link the main Python binary with C++. Change the definition of +LINKCC in Modules/Makefile to be your C++ compiler. You may have to +edit config.c slightly to make it compilable with C++. +

+ +Edit this entry / +Log info +

+ +


+

3.16. Deleted

+

+

+ +Edit this entry / +Log info + +/ Last changed on Tue Sep 11 16:02:22 2001 by +GvR +

+ +


+

3.17. Deleted.

+

+

+ +Edit this entry / +Log info + +/ Last changed on Tue Sep 11 15:54:57 2001 by +GvR +

+ +


+

3.18. Compilation or link errors for the _tkinter module

+Most likely, there's a version mismatch between the Tcl/Tk header
+files (tcl.h and tk.h) and the Tcl/Tk libraries you are using (e.g. the
+"-ltk8.0" and "-ltcl8.0" arguments for _tkinter in the Setup file).
+It is possible to install several versions of the Tcl/Tk libraries,
+but there can only be one version of the tcl.h and tk.h header
+files. If the library doesn't match the header, you'll get
+problems, either when linking the module, or when importing it.
+Fortunately, the version number is clearly stated in each file,
+so this is easy to find. Reinstalling and using the latest
+version usually fixes the problem.

+(Also note that when compiling unpatched Python 1.5.1 against +Tcl/Tk 7.6/4.2 or older, you get an error on Tcl_Finalize. See +the 1.5.1 patch page at http://www.python.org/1.5/patches-1.5.1/.) +

+ +Edit this entry / +Log info + +/ Last changed on Thu Jun 11 00:49:14 1998 by +Gvr +

+ +


+

3.19. I configured and built Python for Tcl/Tk but "import Tkinter" fails.

+Most likely, you forgot to enable the line in Setup that says +"TKPATH=:$(DESTLIB)/tkinter". +

+ +Edit this entry / +Log info +

+ +


+

3.20. [deleted]

+[ancient information on a gcc+tkinter bug on alpha was here] +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 16:46:23 2002 by +Erno Kuusela +

+ +


+

3.21. Several common system calls are missing from the posix module.

+Most likely, all test compilations run by the configure script +are failing for some reason or another. Have a look in config.log to +see what could be the reason. A common reason is specifying a +directory to the --with-readline option that doesn't contain the +libreadline.a file. +

+ +Edit this entry / +Log info +

+ +


+

3.22. ImportError: No module named string, on MS Windows.

+Most likely, your PYTHONPATH environment variable should be set to +something like: +

+set PYTHONPATH=c:\python;c:\python\lib;c:\python\scripts +

+(assuming Python was installed in c:\python) +

+ +Edit this entry / +Log info +

+ +


+

3.23. Core dump on SGI when using the gl module.

+There are conflicts between entry points in the termcap and curses
+libraries and an entry point in the GL library. There's a hack of a
+fix for the termcap library if it's needed for the GNU readline
+library, but it doesn't work when you're using curses. The upshot is that
+you can't build a Python binary containing both the curses and gl
+modules.

+ +Edit this entry / +Log info +

+ +


+

3.24. "Initializer not a constant" while building DLL on MS-Windows

+Static type object initializers in extension modules may cause compiles to +fail with an error message like "initializer not a constant". +Fredrik Lundh <Fredrik.Lundh@image.combitech.se> explains: +

+This shows up when building a DLL under MSVC. There are two ways to
+address this: either compile the module as C++, or change your code to
+something like:

+

+  statichere PyTypeObject bstreamtype = {
+      PyObject_HEAD_INIT(NULL) /* must be set by init function */
+      0,
+      "bstream",
+      sizeof(bstreamobject),
+
+
+  ...
+
+
+  void
+  initbstream()
+  {
+      /* Patch object type */
+      bstreamtype.ob_type = &PyType_Type;
+      Py_InitModule("bstream", functions);
+      ...
+  }
+
+

+ +Edit this entry / +Log info + +/ Last changed on Sun May 25 14:58:05 1997 by +Aaron Watters +

+ +


+

3.25. Output directed to a pipe or file disappears on Linux.

+Some people have reported that when they run their script +interactively, it runs great, but that when they redirect it +to a pipe or file, no output appears. +

+

+    % python script.py
+    ...some output...
+    % python script.py >file
+    % cat file
+    % # no output
+    % python script.py | cat
+    % # no output
+    %
+
+This was a bug in the Linux kernel. It has been fixed and should no longer appear, so most Linux users are not affected by it.
+
+If redirection doesn't work on your Linux system, check what shell you are using; shells like (t)csh don't handle redirection the same way as sh-style shells.

+ +Edit this entry / +Log info + +/ Last changed on Thu Jan 16 13:38:30 2003 by +Jens Kubieziel +

+ +


+

3.26. [deleted]

+[ancient libc/linux problem was here] +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 16:48:08 2002 by +Erno Kuusela +

+ +


+

3.27. [deleted]

+[ancient linux + threads + tk problem was described here] +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 16:49:08 2002 by +Erno Kuusela +

+ +


+

3.28. How can I test if Tkinter is working?

+Try the following: +

+

+  python
+  >>> import _tkinter
+  >>> import Tkinter
+  >>> Tkinter._test()
+
+This should pop up a window with two buttons, +one "Click me" and one "Quit". +

+If the first statement (import _tkinter) fails, your Python +installation probably has not been configured to support Tcl/Tk. +On Unix, if you have installed Tcl/Tk, you have to rebuild Python +after editing the Modules/Setup file to enable the _tkinter module +and the TKPATH environment variable. +

+It is also possible to get complaints about Tcl/Tk version +number mismatches or missing TCL_LIBRARY or TK_LIBRARY +environment variables. These have to do with Tcl/Tk installation +problems. +

+A common problem is to have installed versions of tcl.h and tk.h +that don't match the installed version of the Tcl/Tk libraries; +this usually results in linker errors or (when using dynamic +loading) complaints about missing symbols during loading +the shared library. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Aug 28 17:01:46 1997 by +Guido van Rossum +

+ +


+

3.29. Is there a way to get the interactive mode of the python interpreter to perform function/variable name completion?

+(From a posting by Guido van Rossum) +

+On Unix, if you have enabled the readline module (i.e. if Emacs-style +command line editing and bash-style history works for you), you can +add this by importing the undocumented standard library module +"rlcompleter". When completing a simple identifier, it +completes keywords, built-ins and globals in __main__; when completing +NAME.NAME..., it evaluates (!) the expression up to the last dot and +completes its attributes. +

+This way, you can do "import string", type "string.", hit the +completion key twice, and see the list of names defined by the +string module. +

+Tip: to use the tab key as the completion key, call +

+

+    readline.parse_and_bind("tab: complete")
+
+You can put this in a ~/.pythonrc file, and set the PYTHONSTARTUP +environment variable to ~/.pythonrc. This will cause the completion to be enabled +whenever you run Python interactively. +
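+A minimal startup file along these lines (the path ~/.pythonrc is just the
+example name used above; any file named by PYTHONSTARTUP works) might look like:
+
+    # ~/.pythonrc -- run for interactive sessions when PYTHONSTARTUP points here
+    import readline
+    import rlcompleter
+    readline.parse_and_bind("tab: complete")
+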

+Notes (see the docstring for rlcompleter.py for more information): +

+* The evaluation of the NAME.NAME... form may cause arbitrary +application defined code to be executed if an object with a +__getattr__ hook is found. Since it is the responsibility of the +application (or the user) to enable this feature, I consider this an +acceptable risk. More complicated expressions (e.g. function calls or +indexing operations) are not evaluated. +

+* GNU readline is also used by the built-in functions input() and
+raw_input(), and thus these also benefit/suffer from the completion
+features. Clearly an interactive application can benefit by
+specifying its own completer function and using raw_input() for all
+its input.

+* When stdin is not a tty device, GNU readline is never +used, and this module (and the readline module) are silently inactive. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Jun 12 09:55:24 1998 by +A.M. Kuchling +

+ +


+

3.30. Why is the Python interpreter not built as a shared library?

+(This is a Unix question; on Mac and Windows, it is a shared +library.) +

+It's just a nightmare to get this to work on all different platforms. +Shared library portability is a pain. And yes, I know about GNU libtool +-- but it requires me to use its conventions for filenames etc, and it +would require a complete and utter rewrite of all the makefile and +config tools I'm currently using. +

+In practice, few applications embed Python -- it's much more common to +have Python extensions, which already are shared libraries. Also, +serious embedders often want total control over which Python version +and configuration they use so they wouldn't want to use a standard +shared library anyway. So while the motivation of saving space +when lots of apps embed Python is nice in theory, I +doubt that it will save much in practice. (Hence the low priority I +give to making a shared library.) +

+For Linux systems, the simplest method of producing libpython1.5.so seems to +be (originally from the Minotaur project web page, +http://www.equi4.com/minotaur/minotaur.html): +

+

+  make distclean 
+  ./configure 
+  make OPT="-fpic -O2" 
+  mkdir .extract 
+  (cd .extract; ar xv ../libpython1.5.a) 
+  gcc -shared -o libpython1.5.so .extract/*.o 
+  rm -rf .extract
+
+In Python 2.3 this will be supported by the standard build routine +(at least on Linux) with --enable-shared. Note however that there +is little advantage, and it slows down Python because of the need +for PIC code and the extra cost at startup time to find the library. +

+ +Edit this entry / +Log info + +/ Last changed on Thu May 30 13:36:55 2002 by +GvR +

+ +


+

3.31. Build with GCC on Solaris 2.6 (SunOS 5.6) fails

+If you have upgraded Solaris 2.5 or 2.5.1 to Solaris 2.6, +but you have not upgraded +your GCC installation, the compile may fail, e.g. like this: +

+

+ In file included from /usr/include/sys/stream.h:26,
+                  from /usr/include/netinet/in.h:38,
+                  from /usr/include/netdb.h:96,
+                  from ./socketmodule.c:121:
+ /usr/include/sys/model.h:32: #error "No DATAMODEL_NATIVE specified"
+
+Solution: rebuild GCC for Solaris 2.6. +You might be able to simply re-run fixincludes, but +people have had mixed success with doing that. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Oct 21 11:18:46 1998 by +GvR +

+ +


+

3.32. Running "make clean" seems to leave problematic files that cause subsequent builds to fail.

+Use "make clobber" instead. +

+Use "make clean" to reduce the size of the source/build directory +after you're happy with your build and installation. +If you have already tried to build python and you'd like to start +over, you should use "make clobber". It does a "make clean" and also +removes files such as the partially built Python library from a previous build. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Jun 24 20:39:26 1999 by +TAB +

+ +


+

3.33. Submitting bug reports and patches

+To report a bug or submit a patch, please use the relevant service +from the Python project at SourceForge. +

+Bugs: http://sourceforge.net/tracker/?group_id=5470&atid=105470 +

+Patches: http://sourceforge.net/tracker/?group_id=5470&atid=305470 +

+If you have a SourceForge account, please log in before submitting your bug report; this will make it easier for us to contact you regarding your report in the event we have follow-up questions. It will also enable SourceForge to send you update information as we act on your bug. If you do not have a SourceForge account, please consider leaving your name and email address as part of the report. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Apr 27 10:58:26 2001 by +Fred Drake +

+ +


+

3.34. I can't load shared libraries under Python 1.5.2, Solaris 7, and gcc 2.95.2

+When trying to load shared libraries, you may see errors like:
+
+    ImportError: ld.so.1: python: fatal: relocation error: file
+    /usr/local/lib/python1.5/site-packages/Perp/util/du_SweepUtilc.so:
+    symbol PyExc_RuntimeError: referenced symbol not found
+
+There is a problem with the configure script for Python 1.5.2
+under Solaris 7 with gcc 2.95. configure should set the make variable
+
+    LINKFORSHARED=-Xlinker -export-dynamic
+
+in Modules/Makefile. Manually add this line to Modules/Makefile.
+This builds a Python executable that can load shared library extensions (xxx.so).

+ +Edit this entry / +Log info + +/ Last changed on Mon Feb 19 10:37:05 2001 by +GvR +

+ +


+

3.35. In the regression test, test___all__ fails for the profile module. What's wrong?

+If you have been using the profile module, and have properly calibrated a copy of the module as described in the documentation for the profiler: +

+http://www.python.org/doc/current/lib/profile-calibration.html +

+then it is possible that the regression test "test___all__" will fail if you run the regression test manually rather than using "make test" in the Python source directory. This will happen if you have set your PYTHONPATH environment variable to include the directory containing your calibrated profile module. You have probably calibrated the profiler using an older version of the profile module which does not define the __all__ value, added to the module as of Python 2.1. +

+The problem can be fixed by removing the old calibrated version of the profile module and using the latest version to do a fresh calibration. In general, you will need to re-calibrate for each version of Python anyway, since the performance characteristics can change in subtle ways that impact profiling. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Apr 27 10:44:10 2001 by +Fred Drake +

+ +


+

3.36. relocations remain against allocatable but non-writable sections

+This linker error occurs on Solaris if you attempt to build an extension module which incorporates position-dependent (non-PIC) code. A common source of problems is that a static library (.a file), such as libreadline.a or libcrypto.a, is linked with the extension module. The error specifically occurs when using gcc as the compiler together with /usr/ccs/bin/ld as the linker. +

+The following solutions and work-arounds are known: +

+1. Rebuild the libraries (libreadline, libcrypto) with -fPIC (-KPIC if using the system compiler). This is recommended; all object files in a shared library should be position-independent. +

+2. Statically link the extension module and its libraries into the Python interpreter, by editing Modules/Setup. +

+3. Use GNU ld instead of /usr/ccs/bin/ld; GNU ld will accept non-PIC code in shared libraries (and mark the section writable) +

+4. Pass -mimpure-text to GCC when linking the module. This will force gcc to not pass -z text to ld; in turn, ld will make all text sections writable. +

+Options 3 and 4 are not recommended, since the ability to share code across processes is lost. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 29 12:05:11 2002 by +Martin v. Löwis +

+ +


+

4. Programming in Python

+ +
+

4.1. Is there a source code level debugger with breakpoints, step, etc.?

+Yes. +

+Module pdb is a rudimentary but adequate console-mode debugger for Python. It is part of the standard Python library, and is documented in the Library Reference Manual. (You can also write your own debugger by using the code for pdb as an example.) +

+The IDLE interactive development environment, which is part of the standard Python distribution (normally available in Tools/idle), includes a graphical debugger. There is documentation for the IDLE debugger at http://www.python.org/idle/doc/idle2.html#Debugger +

+Pythonwin is a Python IDE that includes a GUI debugger based on bdb. The Pythonwin debugger colors breakpoints and has quite a few cool features (including debugging non-Pythonwin programs). A reference can be found at http://www.python.org/ftp/python/pythonwin/pwindex.html +More recent versions of PythonWin are available as a part of the ActivePython distribution (see http://www.activestate.com/Products/ActivePython/index.html). +

+Pydb is a version of the standard Python debugger pdb, modified for use with DDD (Data Display Debugger), a popular graphical debugger front end. Pydb can be found at http://packages.debian.org/unstable/devel/pydb.html +and DDD can be found at http://www.gnu.org/software/ddd/ +

+There are a number of commercial Python IDEs that include graphical debuggers. They include: +

+

+ * Wing IDE (http://wingide.com/) 
+ * Komodo IDE (http://www.activestate.com/Products/Komodo/)
+
+

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 28 01:43:41 2003 by +Stephen Ferg +

+ +


+

4.2. Can I create an object class with some methods implemented in C and others in Python (e.g. through inheritance)? (Also phrased as: Can I use a built-in type as base class?)

+In Python 2.2, you can inherit from builtin classes such as int, list, dict, etc. +
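
+For example, here is a minimal sketch (assuming Python 2.2 or later) of a dictionary subclass that supplies a default value for missing keys; the class name and behaviour are only illustrative: +

+

+        class DefaultDict(dict):
+            """A dict that returns a default value for missing keys."""
+            def __init__(self, default=None):
+                dict.__init__(self)
+                self.default = default
+            def __getitem__(self, key):
+                if self.has_key(key):
+                    return dict.__getitem__(self, key)
+                return self.default
+
+        d = DefaultDict(default=0)
+        d['spam'] = 1
+        print d['spam'], d['eggs']      # prints: 1 0
+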

+In previous versions of Python, you can easily create a Python class which serves as a wrapper around a built-in object, e.g. (for dictionaries): +

+

+        # A user-defined class behaving almost identically
+        # to a built-in dictionary.
+        class UserDict:
+                def __init__(self): self.data = {}
+                def __repr__(self): return repr(self.data)
+                def __cmp__(self, dict):
+                        if type(dict) == type(self.data):
+                                return cmp(self.data, dict)
+                        else:
+                                return cmp(self.data, dict.data)
+                def __len__(self): return len(self.data)
+                def __getitem__(self, key): return self.data[key]
+                def __setitem__(self, key, item): self.data[key] = item
+                def __delitem__(self, key): del self.data[key]
+                def keys(self): return self.data.keys()
+                def items(self): return self.data.items()
+                def values(self): return self.data.values()
+                def has_key(self, key): return self.data.has_key(key)
+
+A2. See Jim Fulton's ExtensionClass for an example of a mechanism +which allows you to have superclasses which you can inherit from in +Python -- that way you can have some methods from a C superclass (call +it a mixin) and some methods from either a Python superclass or your +subclass. ExtensionClass is distributed as a part of Zope (see +http://www.zope.org), but will be phased out with Zope 3, since +Zope 3 uses Python 2.2 or later which supports direct inheritance +from built-in types. Here's a link to the original paper about +ExtensionClass: +http://debian.acm.ndsu.nodak.edu/doc/python-extclass/ExtensionClass.html +

+A3. The Boost Python Library (BPL, http://www.boost.org/libs/python/doc/index.html) +provides a way of doing this from C++ (i.e. you can inherit from an +extension class written in C++ using the BPL). +

+ +Edit this entry / +Log info + +/ Last changed on Tue May 28 21:09:52 2002 by +GvR +

+ +


+

4.3. Is there a curses/termcap package for Python?

+The standard Python source distribution comes with a curses module in +the Modules/ subdirectory, though it's not compiled by default (note +that this is not available in the Windows distribution -- there is +no curses module for Windows). +

+In Python versions before 2.0 the module only supported plain curses; +you couldn't use ncurses features like colors with it (though it would +link with ncurses). +

+In Python 2.0, the curses module has been greatly extended, starting +from Oliver Andrich's enhanced version, to provide many additional +functions from ncurses and SYSV curses, such as colour, alternative +character set support, pads, and mouse support. This means the +module is no longer compatible with operating systems that only +have BSD curses, but there don't seem to be any currently +maintained OSes that fall into this category. +
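
+As a rough sketch only (not a complete program), the usual initialise/clean-up pattern for the curses module looks like this: +

+

+        import curses
+
+        stdscr = curses.initscr()       # take over the terminal
+        try:
+            curses.noecho()
+            curses.cbreak()
+            stdscr.addstr(0, 0, "Hello from curses -- press any key")
+            stdscr.refresh()
+            stdscr.getch()
+        finally:
+            curses.nocbreak()           # always restore the terminal state
+            curses.echo()
+            curses.endwin()
+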

+ +Edit this entry / +Log info + +/ Last changed on Sun Jun 23 20:24:06 2002 by +Tim Peters +

+ +


+

4.4. Is there an equivalent to C's onexit() in Python?

+For Python 2.0: The new atexit module provides a register function that +is similar to C's onexit. See the Library Reference for details. For +2.0 you should not assign to sys.exitfunc! +

+For Python 1.5.2: You need to import sys and assign a function to +sys.exitfunc, it will be called when your program exits, is +killed by an unhandled exception, or (on UNIX) receives a +SIGHUP or SIGTERM signal. +
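
+A minimal sketch of the Python 2.0 approach (the function name and argument are only examples): +

+

+        import atexit
+
+        def goodbye(name):
+            print "Goodbye,", name
+
+        atexit.register(goodbye, "world")   # called automatically when the program exits
+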

+ +Edit this entry / +Log info + +/ Last changed on Thu Dec 28 12:14:55 2000 by +Bjorn Pettersen +

+ +


+

4.5. [deleted]

+[python used to lack nested scopes, it was explained here] +

+ +Edit this entry / +Log info + +/ Last changed on Thu Mar 21 05:18:22 2002 by +Erno Kuusela +

+ +


+

4.6. How do I iterate over a sequence in reverse order?

+If it is a list, the fastest solution is +

+

+        list.reverse()
+        try:
+                for x in list:
+                        "do something with x"
+        finally:
+                list.reverse()
+
+This has the disadvantage that while you are in the loop, the list +is temporarily reversed. If you don't like this, you can make a copy. +This appears expensive but is actually faster than other solutions: +

+

+        rev = list[:]
+        rev.reverse()
+        for x in rev:
+                <do something with x>
+
+If it's not a list, a more general but slower solution is: +

+

+        for i in range(len(sequence)-1, -1, -1):
+                x = sequence[i]
+                <do something with x>
+
+A more elegant solution is to define a class which acts as a sequence +and yields the elements in reverse order (solution due to Steve +Majewski): +

+

+        class Rev:
+                def __init__(self, seq):
+                        self.forw = seq
+                def __len__(self):
+                        return len(self.forw)
+                def __getitem__(self, i):
+                        return self.forw[-(i + 1)]
+
+You can now simply write: +

+

+        for x in Rev(list):
+                <do something with x>
+
+Unfortunately, this solution is the slowest of all, due to the method +call overhead... +

+ +Edit this entry / +Log info + +/ Last changed on Sun May 25 21:10:50 1997 by +GvR +

+ +


+

4.7. My program is too slow. How do I speed it up?

+That's a tough one, in general. There are many tricks to speed up +Python code; I would consider rewriting parts in C only as a last +resort. One thing to notice is that function and (especially) method +calls are rather expensive; if you have designed a purely OO interface +with lots of tiny functions that don't do much more than get or set an +instance variable or call another method, you may consider using a +more direct way, e.g. directly accessing instance variables. Also see +the standard module "profile" (described in the Library Reference +manual) which makes it possible to find out where +your program is spending most of its time (if you have some patience +-- the profiling itself can slow your program down by an order of +magnitude). +

+Remember that many standard optimization heuristics you +may know from other programming experience may well apply +to Python. For example it may be faster to send output to output +devices using larger writes rather than smaller ones in order to +avoid the overhead of kernel system calls. Thus CGI scripts +that write all output in "one shot" may be notably faster than +those that write lots of small pieces of output. +

+Also, be sure to use "aggregate" operations where appropriate. +For example the "slicing" feature allows programs to chop up +lists and other sequence objects in a single tick of the interpreter +mainloop using highly optimized C implementations. Thus to +get the same effect as +

+

+  L2 = []
+  for i in range(3):
+       L2.append(L1[i])
+
+it is much shorter and far faster to use +

+

+  L2 = list(L1[:3]) # "list" is redundant if L1 is a list.
+
+Note that the map() function, particularly used with +builtin methods or builtin functions can be a convenient +accelerator. For example to pair the elements of two +lists together: +

+

+  >>> map(None, [1,2,3], [4,5,6])
+  [(1, 4), (2, 5), (3, 6)]
+
+or to compute a number of sines: +

+

+  >>> map( math.sin, (1,2,3,4))
+  [0.841470984808, 0.909297426826, 0.14112000806,   -0.756802495308]
+
+The map operation completes very quickly in such cases. +

+Other examples of aggregate operations include the join and split +methods of string objects. For example if s1..s7 are large (10K+) strings then +"".join([s1,s2,s3,s4,s5,s6,s7]) may be far faster than +the more obvious s1+s2+s3+s4+s5+s6+s7, since the "summation" +will compute many subexpressions, whereas join does all +copying in one pass. For manipulating strings also consider the +regular expression libraries and the "substitution" operations +String % tuple and String % dictionary. Also be sure to use +the list.sort builtin method to do sorting, and see FAQ's 4.51 +and 4.59 for examples of moderately advanced usage -- list.sort beats +other techniques for sorting in all but the most extreme +circumstances. +

+There are many other aggregate operations +available in the standard libraries and in contributed libraries +and extensions. +

+Another common trick is to "push loops into functions or methods." +For example suppose you have a program that runs slowly and you +use the profiler (profile.run) to determine that a Python function ff +is being called lots of times. If you notice that ff +

+

+   def ff(x):
+       ...do something with x computing result...
+       return result
+
+tends to be called in loops like (A) +

+

+   list = map(ff, oldlist)
+
+or (B) +

+

+   for x in sequence:
+       value = ff(x)
+       ...do something with value...
+
+then you can often eliminate function call overhead by rewriting +ff to +

+

+   def ffseq(seq):
+       resultseq = []
+       for x in seq:
+           ...do something with x computing result...
+           resultseq.append(result)
+       return resultseq
+
+and rewrite (A) to +

+

+    list = ffseq(oldlist)
+
+and (B) to +

+

+    for value in ffseq(sequence):
+        ...do something with value...
+
+Other single calls ff(x) translate to ffseq([x])[0] with little +penalty. Of course this technique is not always appropriate +and there are other variants, which you can figure out. +

+You can gain some performance by explicitly storing the results of +a function or method lookup into a local variable. A loop like +

+

+    for key in token:
+        dict[key] = dict.get(key, 0) + 1
+
+resolves dict.get every iteration. If the method isn't going to +change, a faster implementation is +

+

+    dict_get = dict.get  # look up the method once
+    for key in token:
+        dict[key] = dict_get(key, 0) + 1
+
+Default arguments can be used to determine values once, at +compile time instead of at run time. This can only be done for +functions or objects which will not be changed during program +execution, such as replacing +

+

+    def degree_sin(deg):
+        return math.sin(deg * math.pi / 180.0)
+
+with +

+

+    def degree_sin(deg, factor = math.pi/180.0, sin = math.sin):
+        return sin(deg * factor)
+
+Because this trick uses default arguments for terms which should +not be changed, it should only be used when you are not concerned +with presenting a possibly confusing API to your users. +

+

+For an anecdote related to optimization, see +

+

+	http://www.python.org/doc/essays/list2str.html
+
+

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 01:03:54 2002 by +Neal Norwitz +

+ +


+

4.8. When I have imported a module, then edit it, and import it again (into the same Python process), the changes don't seem to take place. What is going on?

+For reasons of efficiency as well as consistency, Python only reads +the module file the first time a module is imported. (Otherwise a +program consisting of many modules, each of which imports the same +basic module, would read the basic module over and over again.) To +force rereading of a changed module, do this: +

+

+        import modname
+        reload(modname)
+
+Warning: this technique is not 100% fool-proof. In particular, +modules containing statements like +

+

+        from modname import some_objects
+
+will continue to work with the old version of the imported objects. +

+ +Edit this entry / +Log info +

+ +


+

4.9. How do I find the current module name?

+A module can find out its own module name by looking at the +(predefined) global variable __name__. If this has the value +'__main__' you are running as a script. +

+ +Edit this entry / +Log info +

+ +


+

4.10. I have a module in which I want to execute some extra code when it is run as a script. How do I find out whether I am running as a script?

+See the previous question. E.g. if you put the following on the +last line of your module, main() is called only when your module is +running as a script: +

+

+        if __name__ == '__main__': main()
+
+

+ +Edit this entry / +Log info +

+ +


+

4.11. I try to run a program from the Demo directory but it fails with ImportError: No module named ...; what gives?

+This is probably an optional module (written in C!) which hasn't +been configured on your system. This especially happens with modules +like "Tkinter", "stdwin", "gl", "Xt" or "Xm". For Tkinter, STDWIN and +many other modules, see Modules/Setup.in for info on how to add these +modules to your Python, if it is possible at all. Sometimes you will +have to ftp and build another package first (e.g. Tcl and Tk for Tkinter). +Sometimes the module only works on specific platforms (e.g. gl only works +on SGI machines). +

+NOTE: if the complaint is about "Tkinter" (upper case T) and you have +already configured module "tkinter" (lower case t), the solution is +not to rename tkinter to Tkinter or vice versa. There is probably +something wrong with your module search path. Check out the value of +sys.path. +

+For X-related modules (Xt and Xm) you will have to do more work: they +are currently not part of the standard Python distribution. You will +have to ftp the Extensions tar file, i.e. +ftp://ftp.python.org/pub/python/src/X-extension.tar.gz and follow +the instructions there. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Feb 12 21:31:08 2003 by +Jens Kubieziel +

+ +


+

4.12. [deleted]

+[stdwin (long dead windowing library) entry deleted] +

+ +Edit this entry / +Log info + +/ Last changed on Thu Mar 21 08:30:13 2002 by +Erno Kuusela +

+ +


+

4.13. What GUI toolkits exist for Python?

+Depending on what platform(s) you are aiming at, there are several. +

+Currently supported solutions: +

+Cross-platform: +

+Tk: +

+There's a neat object-oriented interface to the Tcl/Tk widget set, +called Tkinter. It is part of the standard Python distribution and +well-supported -- all you need to do is build and install Tcl/Tk and +enable the _tkinter module and the TKPATH definition in Modules/Setup +when building Python. This is probably the easiest to install and +use, and the most complete widget set. It is also very likely that in +the future the standard Python GUI API will be based on or at least +look very much like the Tkinter interface. For more info about Tk, +including pointers to the source, see the Tcl/Tk home page at +http://www.scriptics.com. Tcl/Tk is now fully +portable to the Mac and Windows platforms (NT and 95 only); you need +Python 1.4beta3 or later and Tk 4.1patch1 or later. +
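
+As an illustration only (assuming Tcl/Tk and the _tkinter module are installed), a minimal Tkinter program looks like this: +

+

+        from Tkinter import Tk, Label, Button
+
+        root = Tk()
+        Label(root, text="Hello from Tkinter").pack()
+        Button(root, text="Quit", command=root.quit).pack()
+        root.mainloop()
+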

+wxWindows: +

+There's an interface to wxWindows called wxPython. wxWindows is a +portable GUI class library written in C++. It supports GTK, Motif, +MS-Windows and Mac as targets. Ports to other platforms are being +contemplated or have already had some work done on them. wxWindows +preserves the look and feel of the underlying graphics toolkit, and +there is quite a rich widget set and collection of GDI classes. +See the wxWindows page at http://www.wxwindows.org/ for more details. +wxPython is a python extension module that wraps many of the wxWindows +C++ classes, and is quickly gaining popularity amongst Python +developers. You can get wxPython as part of the source or CVS +distribution of wxWindows, or directly from its home page at +http://alldunn.com/wxPython/. +

+Gtk+: +

+PyGtk bindings for the Gtk+ Toolkit by James Henstridge exist; see ftp://ftp.daa.com.au/pub/james/python/. Note that there are two incompatible bindings. If you are using Gtk+ 1.2.x you should get the 0.6.x PyGtk bindings from +

+

+    ftp://ftp.gtk.org/pub/python/v1.2
+
+If you plan to use Gtk+ 2.0 with Python (highly recommended if you are just starting with Gtk), get the most recent distribution from +

+

+    ftp://ftp.gtk.org/pub/python/v2.0
+
+If you are adventurous, you can also check out the source from the Gnome CVS repository. Set your CVS directory to :pserver:anonymous@anoncvs.gnome.org:/cvs/gnome and check the gnome-python module out from the repository. +

+Other: +

+There are also bindings available for the Qt toolkit (PyQt), and for KDE (PyKDE); see http://www.thekompany.com/projects/pykde/. +

+For OpenGL bindings, see http://starship.python.net/~da/PyOpenGL. +

+Platform specific: +

+The Mac port has a rich and ever-growing set of modules that support +the native Mac toolbox calls. See the documentation that comes with +the Mac port. See ftp://ftp.python.org/pub/python/mac. Support +by Jack Jansen jack@cwi.nl. +

+Pythonwin by Mark Hammond (MHammond@skippinet.com.au) +includes an interface to the Microsoft Foundation +Classes and a Python programming environment using it that's written +mostly in Python. See http://www.python.org/windows/. +

+There's an object-oriented GUI based on the Microsoft Foundation +Classes model called WPY, supported by Jim Ahlstrom jim@interet.com. +Programs written in WPY run unchanged and with native look and feel on +Windows NT/95, Windows 3.1 (using win32s), and on Unix (using Tk). +Source and binaries for Windows and Linux are available in +ftp://ftp.python.org/pub/python/wpy/. +

+Obsolete or minority solutions: +

+There's an interface to X11, including the Athena and Motif widget +sets (and a few individual widgets, like Mosaic's HTML widget and +SGI's GL widget) available from +ftp://ftp.python.org/pub/python/src/X-extension.tar.gz. +Support by Sjoerd Mullender sjoerd@cwi.nl. +

+On top of the X11 interface there's the vpApp +toolkit by Per Spilling, now also maintained by Sjoerd Mullender +sjoerd@cwi.nl. See ftp://ftp.cwi.nl/pub/sjoerd/vpApp.tar.gz. +

+For SGI IRIX only, there are unsupported interfaces to the complete +GL (Graphics Library -- low level but very good 3D capabilities) as +well as to FORMS (a buttons-and-sliders-etc package built on top of GL +by Mark Overmars -- ftp'able from +ftp://ftp.cs.ruu.nl/pub/SGI/FORMS/). This is probably also +becoming obsolete, as OpenGL takes over (see above). +

+There's an interface to STDWIN, a platform-independent low-level +windowing interface for Mac and X11. This is totally unsupported and +rapidly becoming obsolete. The STDWIN sources are at +ftp://ftp.cwi.nl/pub/stdwin/. +

+There is an interface to WAFE, a Tcl interface to the X11 +Motif and Athena widget sets. WAFE is at +http://www.wu-wien.ac.at/wafe/wafe.html. +

+ +Edit this entry / +Log info + +/ Last changed on Mon May 13 21:40:39 2002 by +Skip Montanaro +

+ +


+

4.14. Are there any interfaces to database packages in Python?

+Yes! See the Database Topic Guide at +http://www.python.org/topics/database/ for details. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 4 20:12:19 2000 by +Barney Warplug +

+ +


+

4.15. Is it possible to write obfuscated one-liners in Python?

+Yes. See the following three examples, due to Ulf Bartelt: +

+

+        # Primes < 1000
+        print filter(None,map(lambda y:y*reduce(lambda x,y:x*y!=0,
+        map(lambda x,y=y:y%x,range(2,int(pow(y,0.5)+1))),1),range(2,1000)))
+
+
+        # First 10 Fibonacci numbers
+        print map(lambda x,f=lambda x,f:(x<=1) or (f(x-1,f)+f(x-2,f)): f(x,f),
+        range(10))
+
+
+        # Mandelbrot set
+        print (lambda Ru,Ro,Iu,Io,IM,Sx,Sy:reduce(lambda x,y:x+y,map(lambda y,
+        Iu=Iu,Io=Io,Ru=Ru,Ro=Ro,Sy=Sy,L=lambda yc,Iu=Iu,Io=Io,Ru=Ru,Ro=Ro,i=IM,
+        Sx=Sx,Sy=Sy:reduce(lambda x,y:x+y,map(lambda x,xc=Ru,yc=yc,Ru=Ru,Ro=Ro,
+        i=i,Sx=Sx,F=lambda xc,yc,x,y,k,f=lambda xc,yc,x,y,k,f:(k<=0)or (x*x+y*y
+        >=4.0) or 1+f(xc,yc,x*x-y*y+xc,2.0*x*y+yc,k-1,f):f(xc,yc,x,y,k,f):chr(
+        64+F(Ru+x*(Ro-Ru)/Sx,yc,0,0,i)),range(Sx))):L(Iu+y*(Io-Iu)/Sy),range(Sy
+        ))))(-2.1, 0.7, -1.2, 1.2, 30, 80, 24)
+        #    \___ ___/  \___ ___/  |   |   |__ lines on screen
+        #        V          V      |   |______ columns on screen
+        #        |          |      |__________ maximum of "iterations"
+        #        |          |_________________ range on y axis
+        #        |____________________________ range on x axis
+
+Don't try this at home, kids! +

+ +Edit this entry / +Log info + +/ Last changed on Wed May 21 15:48:33 1997 by +GvR +

+ +


+

4.16. Is there an equivalent of C's "?:" ternary operator?

+Not directly. In many cases you can mimic a?b:c with "a and b or +c", but there's a flaw: if b is zero (or empty, or None -- anything +that tests false) then c will be selected instead. In many cases you +can prove by looking at the code that this can't happen (e.g. because +b is a constant or has a type that can never be false), but in general +this can be a problem. +

+Tim Peters (who wishes it was Steve Majewski) suggested the following +solution: (a and [b] or [c])[0]. Because [b] is a singleton list it +is never false, so the wrong path is never taken; then applying [0] to +the whole thing gets the b or c that you really wanted. Ugly, but it +gets you there in the rare cases where it is really inconvenient to +rewrite your code using 'if'. +

+As a last resort it is possible to implement the "?:" operator as a function: +

+

+    def q(cond,on_true,on_false):
+        from inspect import isfunction
+
+
+        if cond:
+            if not isfunction(on_true): return on_true
+            else: return apply(on_true)
+        else:
+            if not isfunction(on_false): return on_false 
+            else: return apply(on_false)
+
+In most cases you'll pass b and c directly: q(a,b,c). To avoid evaluating b +or c when they shouldn't be, encapsulate them +within a lambda function, e.g.: q(a,lambda: b, lambda: c). +

+

+

+It has been asked why Python has no if-then-else expression, +since most languages have one; it is a frequently requested feature. +

+There are several possible answers: just as many languages do +just fine without one; it can easily lead to less readable code; +no sufficiently "Pythonic" syntax has been discovered; a search +of the standard library found remarkably few places where using an +if-then-else expression would make the code more understandable. +

+Nevertheless, in an effort to decide once and for all whether +an if-then-else expression should be added to the language, +PEP 308 (http://www.python.org/peps/pep-0308.html) has been +put forward, proposing a specific syntax. The community can +now vote on this issue. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Feb 7 19:41:13 2003 by +David Goodger +

+ +


+

4.17. My class defines __del__ but it is not called when I delete the object.

+There are several possible reasons for this. +

+The del statement does not necessarily call __del__ -- it simply +decrements the object's reference count, and if this reaches zero +__del__ is called. +

+If your data structures contain circular links (e.g. a tree where +each child has a parent pointer and each parent has a list of +children) the reference counts will never go back to zero. You'll +have to define an explicit close() method which removes those +pointers. Please don't ever call __del__ directly -- __del__ should +call close() and close() should make sure that it can be called more +than once for the same object. +

+If the object has ever been a local variable (or argument, which is +really the same thing) to a function that caught an expression in an +except clause, chances are that a reference to the object still exists +in that function's stack frame as contained in the stack trace. +Normally, deleting (better: assigning None to) sys.exc_traceback will +take care of this. If a stack was printed for an unhandled +exception in an interactive interpreter, delete sys.last_traceback +instead. +

+There is code that deletes all objects when the interpreter exits, +but it is not called if your Python has been configured to support +threads (because other threads may still be active). You can define +your own cleanup function using sys.exitfunc (see question 4.4). +

+Finally, if your __del__ method raises an exception, a warning message is printed to sys.stderr. +

+

+Starting with Python 2.0, a garbage collector periodically reclaims the space used by most cycles with no external references. (See the "gc" module documentation for details.) There are, however, pathological cases where it can be expected to fail. Moreover, the garbage collector runs some time after the last reference to your data structure vanishes, so your __del__ method may be called at an arbitrary and inconvenient time, which is a problem if you're trying to reproduce a bug. Worse, the order in which objects' __del__ methods are executed is arbitrary. +

+Another way to avoid cyclical references is to use the "weakref" module, which allows you to point to objects without incrementing their reference count. Tree data structures, for instance, should use weak references for their parent and sibling pointers (if they need them!). +
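
+A minimal sketch (assuming Python 2.1 or later; the Node class is purely illustrative) of a parent pointer kept as a weak reference: +

+

+        import weakref
+
+        class Node:
+            def __init__(self, parent=None):
+                self.children = []
+                self.parent = None
+                if parent is not None:
+                    # a weak reference does not keep the parent alive
+                    self.parent = weakref.ref(parent)
+                    parent.children.append(self)
+
+        root = Node()
+        child = Node(root)
+        print child.parent() is root    # calling the weakref returns the parent (or None if it is gone)
+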

+Question 6.14 is intended to explain the new garbage collection algorithm. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 10 15:27:28 2002 by +Matthias Urlichs +

+ +


+

4.18. How do I change the shell environment for programs called using os.popen() or os.system()? Changing os.environ doesn't work.

+You must be using either a version of Python before 1.4, or a +(rare) system that doesn't have the putenv() library function. +

+Before Python 1.4, modifying the environment passed to subshells was +left out of the interpreter because there seemed to be no +well-established portable way to do it (in particular, some systems have putenv(), others have setenv(), and some have none at all). As +of Python 1.4, almost all Unix systems do have putenv(), and so does +the Win32 API, and thus the os module was modified so that changes to +os.environ are trapped and the corresponding putenv() call is made. +
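
+A minimal sketch (the variable name and shell command are only for illustration): +

+

+        import os
+
+        os.environ['SPAM'] = 'eggs'     # the change is trapped and putenv() is called
+        os.system('echo $SPAM')         # the subshell sees the new value
+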

+ +Edit this entry / +Log info +

+ +


+

4.19. What is a class?

+A class is the particular object type created by executing +a class statement. Class objects are used as templates, to create +instance objects, which embody both the data structure +(attributes) and program routines (methods) specific to a datatype. +

+A class can be based on one or more other classes, called its base +class(es). It then inherits the attributes and methods of its base classes. This allows an object model to be successively refined +by inheritance. +

+The term "classic class" is used to refer to the original +class implementation in Python. One problem with classic +classes is their inability to use the built-in data types +(such as list and dictionary) as base classes. Starting +with Python 2.2 an attempt is in progress to unify user-defined +classes and built-in types. It is now possible to declare classes +that inherit from built-in types. +

+ +Edit this entry / +Log info + +/ Last changed on Mon May 27 01:31:21 2002 by +Steve Holden +

+ +


+

4.20. What is a method?

+A method is a function that you normally call as +x.name(arguments...) for some object x. The term is used for methods +of classes and class instances as well as for methods of built-in +objects. (The latter have a completely different implementation and +only share the way their calls look in Python code.) Methods of +classes (and class instances) are defined as functions inside the +class definition. +

+ +Edit this entry / +Log info +

+ +


+

4.21. What is self?

+Self is merely a conventional name for the first argument of a +method -- i.e. a function defined inside a class definition. A method +defined as meth(self, a, b, c) should be called as x.meth(a, b, c) for +some instance x of the class in which the definition occurs; +the called method will think it is called as meth(x, a, b, c). +
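
+A minimal sketch (the class and method names are only illustrative): +

+

+        class Greeter:
+            def greet(self, name):       # 'self' is the instance the method is called on
+                print "Hello,", name
+
+        g = Greeter()
+        g.greet("world")                 # the call supplies g as 'self' automatically
+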

+ +Edit this entry / +Log info +

+ +


+

4.22. What is an unbound method?

+An unbound method is a method defined in a class that is not yet +bound to an instance. You get an unbound method if you ask for a +class attribute that happens to be a function. You get a bound method +if you ask for an instance attribute. A bound method knows which +instance it belongs to and calling it supplies the instance automatically; +an unbound method only knows which class it wants for its first +argument (a derived class is also OK). Calling an unbound method +doesn't "magically" derive the first argument from the context -- you +have to provide it explicitly. +

+Trivia note regarding bound methods: each reference to a bound +method of a particular object creates a bound method object. If you +have two such references (a = inst.meth; b = inst.meth), they will +compare equal (a == b) but are not the same (a is not b). +
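
+For illustration (the class is only an example): +

+

+        class C:
+            def meth(self):
+                return 42
+
+        c = C()
+        unbound = C.meth            # unbound: you must pass an instance yourself
+        bound = c.meth              # bound: remembers the instance c
+        print unbound(c), bound()   # prints: 42 42
+
+        a = c.meth
+        b = c.meth
+        print a == b, a is b        # equal (true), but not the same object (false)
+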

+ +Edit this entry / +Log info + +/ Last changed on Wed May 6 18:07:25 1998 by +Clarence Gardner +

+ +


+

4.23. How do I call a method defined in a base class from a derived class that overrides it?

+If your class definition starts with "class Derived(Base): ..." +then you can call method meth defined in Base (or one of Base's base +classes) as Base.meth(self, arguments...). Here, Base.meth is an +unbound method (see previous question). +
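
+A minimal sketch (the class and method names are only illustrative): +

+

+        class Base:
+            def meth(self):
+                print "Base.meth"
+
+        class Derived(Base):
+            def meth(self):
+                Base.meth(self)          # explicitly call the overridden base class method
+                print "Derived.meth"
+
+        Derived().meth()                 # prints Base.meth, then Derived.meth
+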

+ +Edit this entry / +Log info +

+ +


+

4.24. How do I call a method from a base class without using the name of the base class?

+DON'T DO THIS. REALLY. I MEAN IT. It appears that you could call +self.__class__.__bases__[0].meth(self, arguments...) but this fails when +a doubly-derived method is derived from your class: for its instances, +self.__class__.__bases__[0] is your class, not its base class -- so +(assuming you are doing this from within Derived.meth) you would start +a recursive call. +

+Often when you want to do this you are forgetting that classes +are first class in Python. You can "point to" the class you want +to delegate an operation to either at the instance or at the +subclass level. For example if you want to use a "glorp" +operation of a superclass you can point to the right superclass +to use. +

+

+  class subclass(superclass1, superclass2, superclass3):
+      delegate_glorp = superclass2
+      ...
+      def glorp(self, arg1, arg2):
+            ... subclass specific stuff ...
+            self.delegate_glorp.glorp(self, arg1, arg2)
+       ...
+
+
+  class subsubclass(subclass):
+       delegate_glorp = superclass3
+       ...
+
+Note, however, that setting delegate_glorp to subclass in +subsubclass would cause an infinite recursion on subclass.delegate_glorp. Careful! Maybe you are getting too fancy for your own good. Consider simplifying the design (?). +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jul 28 13:58:22 1997 by +aaron watters +

+ +


+

4.25. How can I organize my code to make it easier to change the base class?

+You could define an alias for the base class, assign the real base +class to it before your class definition, and use the alias throughout +your class. Then all you have to change is the value assigned to the +alias. Incidentally, this trick is also handy if you want to decide +dynamically (e.g. depending on availability of resources) which base +class to use. Example: +

+

+        BaseAlias = <real base class>
+        class Derived(BaseAlias):
+                def meth(self):
+                        BaseAlias.meth(self)
+                        ...
+
+

+ +Edit this entry / +Log info + +/ Last changed on Wed May 21 15:49:57 1997 by +GvR +

+ +


+

4.26. How can I find the methods or attributes of an object?

+This depends on the object type. +

+For an instance x of a user-defined class, instance attributes are +found in the dictionary x.__dict__, and methods and attributes defined +by its class are found in x.__class__.__bases__[i].__dict__ (for i in +range(len(x.__class__.__bases__))). You'll have to walk the tree of +base classes to find all class methods and attributes. +
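
+A rough sketch of such a walk over the base classes (for classic classes; the helper function is not part of the standard library): +

+

+        def all_attribute_names(x):
+            names = x.__dict__.keys()                 # instance attributes
+            pending = [x.__class__]
+            while pending:
+                klass = pending[0]
+                del pending[0]
+                names.extend(klass.__dict__.keys())   # methods and class attributes
+                pending.extend(list(klass.__bases__))
+            return names
+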

+Many, but not all, built-in types define a list of their method names +in x.__methods__, and if they have data attributes, their names may be +found in x.__members__. However, this is only a convention. +

+For more information, read the source of the standard (but +undocumented) module newdir. +

+ +Edit this entry / +Log info +

+ +


+

4.27. I can't seem to use os.read() on a pipe created with os.popen().

+os.read() is a low-level function which takes a file descriptor (a +small integer). os.popen() creates a high-level file object -- the +same type used for sys.std{in,out,err} and returned by the builtin +open() function. Thus, to read n bytes from a pipe p created with +os.popen(), you need to use p.read(n). +
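
+For example (the command is only illustrative): +

+

+        import os
+
+        p = os.popen("echo hello")   # p is a file object, not a file descriptor
+        print p.read()               # so use the file object's own read() method
+        p.close()
+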

+ +Edit this entry / +Log info +

+ +


+

4.28. How can I create a stand-alone binary from a Python script?

+Even though there are Python compilers being developed, +you probably don't need a real compiler, if all you want +is a stand-alone program. There are three solutions to that. +

+One is to use the freeze tool, which is included in the Python +source tree as Tools/freeze. It converts Python byte +code to C arrays. Using a C compiler, you can embed all +your modules into a new program, which is then linked +with the standard Python modules. +

+It works by scanning your source recursively for import statements +(in both forms) and looking for the modules in the standard Python path +as well as in the source directory (for built-in modules). It then +turns the modules written in Python into C code (array initializers +that can be turned into code objects using the marshal module) and +creates a custom-made config file that only contains those built-in +modules which are actually used in the program. It then compiles the +generated C code and links it with the rest of the Python interpreter +to form a self-contained binary which acts exactly like your script. +

+(Hint: the freeze program only works if your script's filename ends in +".py".) +

+There are several utilities which may be helpful. The first is Gordon McMillan's installer at +

+

+    http://www.mcmillan-inc.com/install1.html
+
+which works on Windows, Linux and at least some forms of Unix. +

+Another is Thomas Heller's py2exe (Windows only) at +

+

+    http://starship.python.net/crew/theller/py2exe/
+
+A third is Christian Tismer's SQFREEZE +(http://starship.python.net/crew/pirx/) which appends the byte code +to a specially-prepared Python interpreter, which +will find the byte code in the executable. +

+A fourth is Fredrik Lundh's Squeeze +(http://www.pythonware.com/products/python/squeeze/). +

+ +Edit this entry / +Log info + +/ Last changed on Wed Jun 19 14:01:30 2002 by +Gordon McMillan +

+ +


+

4.29. What WWW tools are there for Python?

+See the chapters titled "Internet Protocols and Support" and +"Internet Data Handling" in the Library Reference +Manual. Python is full of good things which will help you build server-side and client-side web systems. +
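
+For instance, fetching a page on the client side takes only a few lines with the standard urllib module (the URL is just an example): +

+

+        import urllib
+
+        data = urllib.urlopen("http://www.python.org/").read()
+        print len(data), "bytes read"
+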

+A summary of available frameworks is maintained by Paul Boddie at +

+

+    http://thor.prohosting.com/~pboddie/Python/web_modules.html
+
+Cameron Laird maintains a useful set of pages about Python web technologies at +

+

+   http://starbase.neosoft.com/~claird/comp.lang.python/web_python.html/
+
+There was a web browser written in Python, called Grail -- +see http://sourceforge.net/project/grail/. This project has been terminated; http://cvs.sourceforge.net/cgi-bin/viewcvs.cgi/grail/grail/README gives more details. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Nov 11 22:48:25 2002 by +GvR +

+ +


+

4.30. How do I run a subprocess with pipes connected to both input and output?

+Use the standard popen2 module. For example: +

+

+	import popen2
+	fromchild, tochild = popen2.popen2("command")
+	tochild.write("input\n")
+	tochild.flush()
+	output = fromchild.readline()
+
+Warning: in general, it is unwise to +do this, because you can easily cause a deadlock where your +process is blocked waiting for output from the child, while the child +is blocked waiting for input from you. This can be caused +because the parent expects the child to output more text than it does, +or it can be caused by data being stuck in stdio buffers due to lack +of flushing. The Python parent can of course explicitly flush the data +it sends to the child before it reads any output, but if the child is +a naive C program it can easily have been written to never explicitly +flush its output, even if it is interactive, since flushing is +normally automatic. +

+Note that a deadlock is also possible if you use popen3 to read +stdout and stderr. If one of the two is too large for the internal +buffer (increasing the buffersize does not help) and you read() +the other one first, there is a deadlock, too. +

+Note on a bug in popen2: unless your program calls wait() +or waitpid(), finished child processes are never removed, +and eventually calls to popen2 will fail because of a limit on +the number of child processes. Calling os.waitpid with the +os.WNOHANG option can prevent this; a good place to insert such +a call would be before calling popen2 again. +

+Another way to produce a deadlock: Call a wait() and there is +still more output from the program than what fits into the +internal buffers. +

+In many cases, all you really need is to run some data through a +command and get the result back. Unless the data is infinite in size, +the easiest (and often the most efficient!) way to do this is to write +it to a temporary file and run the command with that temporary file as +input. The standard module tempfile exports a function mktemp() which +generates unique temporary file names. +

+

+ import tempfile
+ import os
+ class Popen3:
+    """
+    This is a deadlock-safe version of popen that returns
+    an object with errorlevel, out (a string) and err (a string).
+    (capturestderr may not work under windows.)
+    Example: print Popen3('grep spam','\n\nhere spam\n\n').out
+    """
+    def __init__(self,command,input=None,capturestderr=None):
+        outfile=tempfile.mktemp()
+        command="( %s ) > %s" % (command,outfile)
+        if input:
+            infile=tempfile.mktemp()
+            open(infile,"w").write(input)
+            command=command+" <"+infile
+        if capturestderr:
+            errfile=tempfile.mktemp()
+            command=command+" 2>"+errfile
+        self.errorlevel=os.system(command) >> 8
+        self.out=open(outfile,"r").read()
+        os.remove(outfile)
+        if input:
+            os.remove(infile)
+        if capturestderr:
+            self.err=open(errfile,"r").read()
+            os.remove(errfile)
+
+Note that many interactive programs (e.g. vi) don't work well with +pipes substituted for standard input and output. You will have to use +pseudo ttys ("ptys") instead of pipes. There is some undocumented +code to use these in the library module pty.py -- I'm afraid you're on +your own here. +

+A different answer is a Python interface to Don Libes' "expect" +library. A Python extension that interfaces to expect is called "expy" +and available from +http://expectpy.sourceforge.net/. +

+A pure Python solution that works like expect is pexpect of Noah Spurrier. +A beta version is available from +http://pexpect.sourceforge.net/ +

+ +Edit this entry / +Log info + +/ Last changed on Tue Sep 3 16:31:31 2002 by +Tobias Polzin +

+ +


+

4.31. How do I call a function if I have the arguments in a tuple?

+Use the built-in function apply(). For instance, +

+

+    func(1, 2, 3)
+
+is equivalent to +

+

+    args = (1, 2, 3)
+    apply(func, args)
+
+Note that func(args) is not the same -- it calls func() with exactly +one argument, the tuple args, instead of three arguments, the integers +1, 2 and 3. +

+In Python 2.0, you can also use extended call syntax: +

+f(*args) is equivalent to apply(f, args) +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 2 03:42:50 2001 by +Moshe Zadka +

+ +


+

4.32. How do I enable font-lock-mode for Python in Emacs?

+If you are using XEmacs 19.14 or later, any XEmacs 20, FSF Emacs 19.34 +or any Emacs 20, font-lock should work automatically for you if you +are using the latest python-mode.el. +

+If you are using an older version of XEmacs or Emacs you will need +to put this in your .emacs file: +

+

+        (defun my-python-mode-hook ()
+          (setq font-lock-keywords python-font-lock-keywords)
+          (font-lock-mode 1))
+        (add-hook 'python-mode-hook 'my-python-mode-hook)
+
+

+ +Edit this entry / +Log info + +/ Last changed on Mon Apr 6 16:18:46 1998 by +Barry Warsaw +

+ +


+

4.33. Is there a scanf() or sscanf() equivalent?

+Not as such. +

+For simple input parsing, the easiest approach is usually to split +the line into whitespace-delimited words using string.split(), and to +convert decimal strings to numeric values using int(), +long() or float(). (Python's int() is 32-bit and its +long() is arbitrary precision.) string.split supports an optional +"sep" parameter which is useful if the line uses something other +than whitespace as a delimiter. +
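
+A minimal sketch of that approach (the input line is only an example): +

+

+        import string
+
+        line = "spam 3 1.50"
+        name, count, price = string.split(line)
+        count = int(count)
+        price = float(price)
+        print name, count, price          # prints: spam 3 1.5
+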

+For more complicated input parsing, regular expressions (see module re) +are better suited and more powerful than C's sscanf(). +

+There's a contributed module that emulates sscanf(), by Steve Clift; +see contrib/Misc/sscanfmodule.c of the ftp site: +

+

+    http://www.python.org/ftp/python/contrib-09-Dec-1999/Misc/
+
+

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 01:07:51 2002 by +Neal Norwitz +

+ +


+

4.34. Can I have Tk events handled while waiting for I/O?

+Yes, and you don't even need threads! But you'll have to +restructure your I/O code a bit. Tk has the equivalent of Xt's +XtAddInput() call, which allows you to register a callback function +which will be called from the Tk mainloop when I/O is possible on a +file descriptor. Here's what you need: +

+

+        from Tkinter import tkinter
+        tkinter.createfilehandler(file, mask, callback)
+
+The file may be a Python file or socket object (actually, anything +with a fileno() method), or an integer file descriptor. The mask is +one of the constants tkinter.READABLE or tkinter.WRITABLE. The +callback is called as follows: +

+

+        callback(file, mask)
+
+You must unregister the callback when you're done, using +

+

+        tkinter.deletefilehandler(file)
+
+Note: since you don't know *how many bytes* are available for reading, +you can't use the Python file object's read or readline methods, since +these will insist on reading a predefined number of bytes. For +sockets, the recv() or recvfrom() methods will work fine; for other +files, use os.read(file.fileno(), maxbytecount). +

+ +Edit this entry / +Log info +

+ +


+

4.35. How do I write a function with output parameters (call by reference)?

+[Mark Lutz] The thing to remember is that arguments are passed by +assignment in Python. Since assignment just creates references to +objects, there's no alias between an argument name in the caller and +callee, and so no call-by-reference per se. But you can simulate it +in a number of ways: +

+1) By using global variables; but you probably shouldn't :-) +

+2) By passing a mutable (changeable in-place) object: +

+

+      def func1(a):
+          a[0] = 'new-value'     # 'a' references a mutable list
+          a[1] = a[1] + 1        # changes a shared object
+
+
+      args = ['old-value', 99]
+      func1(args)
+      print args[0], args[1]     # output: new-value 100
+
+3) By returning a tuple, holding the final values of arguments: +

+

+      def func2(a, b):
+          a = 'new-value'        # a and b are local names
+          b = b + 1              # assigned to new objects
+          return a, b            # return new values
+
+
+      x, y = 'old-value', 99
+      x, y = func2(x, y)
+      print x, y                 # output: new-value 100
+
+4) And other ideas that fall-out from Python's object model. For instance, it might be clearer to pass in a mutable dictionary: +

+

+      def func3(args):
+          args['a'] = 'new-value'     # args is a mutable dictionary
+          args['b'] = args['b'] + 1   # change it in-place
+
+
+      args = {'a':' old-value', 'b': 99}
+      func3(args)
+      print args['a'], args['b']
+
+5) Or bundle-up values in a class instance: +

+

+      class callByRef:
+          def __init__(self, **args):
+              for (key, value) in args.items():
+                  setattr(self, key, value)
+
+
+      def func4(args):
+          args.a = 'new-value'        # args is a mutable callByRef
+          args.b = args.b + 1         # change object in-place
+
+
+      args = callByRef(a='old-value', b=99)
+      func4(args)
+      print args.a, args.b
+
+
+   But there's probably no good reason to get this complicated :-).
+
+[Python's author favors solution 3 in most cases.] +

+ +Edit this entry / +Log info + +/ Last changed on Sun Jun 8 23:49:46 1997 by +David Ascher +

+ +


+

4.36. Please explain the rules for local and global variables in Python.

+[Ken Manheimer] In Python, procedure variables are implicitly +global, unless they are assigned anywhere within the block. +In that case +they are implicitly local, and if you want them to be global you need to explicitly declare them as +'global'. +

+Though a bit surprising at first, a moment's consideration explains +this. On one hand, requirement of 'global' for assigned vars provides +a bar against unintended side-effects. On the other hand, if global +were required for all global references, you'd be using global all the +time. Eg, you'd have to declare as global every reference to a +builtin function, or to a component of an imported module. This +clutter would defeat the usefulness of the 'global' declaration for +identifying side-effects. +
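
+A minimal sketch of the rule (illustrative names only): +

+

+        x = 10
+
+        def show():
+            print x          # x is not assigned in this function, so it refers to the global
+
+        def bump():
+            global x         # needed because x is assigned below
+            x = x + 1        # without the declaration, x would be treated as a new local name
+
+        show(); bump(); show()    # prints 10, then 11
+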

+ +Edit this entry / +Log info + +/ Last changed on Fri Aug 28 09:53:27 1998 by +GvR +

+ +


+

4.37. How can I have modules that mutually import each other?

+Suppose you have the following modules: +

+foo.py: +

+

+	from bar import bar_var
+	foo_var=1
+
+bar.py: +

+

+	from foo import foo_var
+	bar_var=2
+
+The problem is that the above is processed by the interpreter thus: +

+

+	main imports foo
+	Empty globals for foo are created
+	foo is compiled and starts executing
+	foo imports bar
+	Empty globals for bar are created
+	bar is compiled and starts executing
+	bar imports foo (which is a no-op since there already is a module named foo)
+	bar.foo_var = foo.foo_var
+	...
+
+The last step fails, because Python isn't done with interpreting foo yet and the global symbol dict for foo is still empty. +

+The same thing happens when you use "import foo", and then try to access "foo.foo_var" in global code. +

+

+There are (at least) three possible workarounds for this problem. +

+Guido van Rossum recommends to avoid all uses of "from <module> import ..." (so everything from an imported module is referenced as <module>.<name>) and to place all code inside functions. Initializations of global variables and class variables should use constants or built-in functions only. +

+

+Jim Roskind suggests the following order in each module: +

+

+ exports (globals, functions, and classes that don't need imported base classes)
+ import statements
+ active code (including globals that are initialized from imported values).
+
+Python's author doesn't like this approach much because the imports +appear in a strange place, but has to admit that it works. +

+

+

+Matthias Urlichs recommends to restructure your code so that the recursive import is not necessary in the first place. +

+

+These solutions are not mutually exclusive. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 06:52:51 2002 by +Matthias Urlichs +

+ +


+

4.38. How do I copy an object in Python?

+Try copy.copy() or copy.deepcopy() for the general case. Not all objects can be copied, but most can. +

+Dictionaries have a copy method. Sequences can be copied by slicing: +

+ new_l = l[:]
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Mar 21 05:40:26 2002 by +Erno Kuusela +

+ +


+

4.39. How to implement persistent objects in Python? (Persistent == automatically saved to and restored from disk.)

+The library module "pickle" now solves this in a very general way +(though you still can't store things like open files, sockets or +windows), and the library module "shelve" uses pickle and (g)dbm to +create persistent mappings containing arbitrary Python objects. +For possibly better performance also look for the latest version +of the relatively recent cPickle module. +
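
+A minimal sketch of both modules (the file names are just examples): +

+

+        import pickle, shelve
+
+        data = {'spam': 1, 'eggs': [2, 3]}
+
+        f = open('data.pck', 'w')
+        pickle.dump(data, f)          # save the object
+        f.close()
+
+        f = open('data.pck', 'r')
+        restored = pickle.load(f)     # restore it
+        f.close()
+
+        db = shelve.open('data.db')   # a dictionary-like object stored on disk
+        db['key'] = data
+        db.close()
+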

+A more awkward way of doing things is to use pickle's little sister, +marshal. The marshal module provides very fast ways to store +noncircular basic Python types to files and strings, and back again. +Although marshal does not do fancy things like store instances or +handle shared references properly, it does run extremely fast. For +example loading a half megabyte of data may take less than a +third of a second (on some machines). This often beats doing +something more complex and general such as using gdbm with +pickle/shelve. +

+ +Edit this entry / +Log info + +/ Last changed on Sun Jun 8 22:59:00 1997 by +David Ascher +

+ +


+

4.40. I try to use __spam and I get an error about _SomeClassName__spam.

+Variables with double leading underscore are "mangled" to provide a +simple but effective way to define class private variables. See the +chapter "New in Release 1.4" in the Python Tutorial. +

+ +Edit this entry / +Log info +

+ +


+

4.41. How do I delete a file? And other file questions.

+Use os.remove(filename) or os.unlink(filename); for documentation, +see the posix section of the library manual. They are the same, +unlink() is simply the Unix name for this function. In earlier +versions of Python, only os.unlink() was available. +

+To remove a directory, use os.rmdir(); use os.mkdir() to create one. +

+To rename a file, use os.rename(). +

+To truncate a file, open it using f = open(filename, "r+"), and use +f.truncate(offset); offset defaults to the current seek position. +(The "r+" mode opens the file for reading and writing.) +There's also os.ftruncate(fd, offset) for files opened with os.open() +-- for advanced Unix hacks only. +

+The shutil module also contains a number of functions to work on files +including copyfile, copytree, and rmtree amongst others. +
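
+A minimal sketch of the calls mentioned above (the file names are just examples and must exist where noted): +

+

+        import os, shutil
+
+        shutil.copyfile('spam.txt', 'spam.bak')   # copy an existing file
+        os.rename('spam.bak', 'eggs.txt')         # rename it
+        os.remove('eggs.txt')                     # delete it (same as os.unlink)
+        os.mkdir('newdir')                        # create a directory
+        os.rmdir('newdir')                        # and remove it again
+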

+ +Edit this entry / +Log info + +/ Last changed on Thu Dec 28 12:30:01 2000 by +Bjorn Pettersen +

+ +


+

4.42. How to modify urllib or httplib to support HTTP/1.1?

+Recent versions of Python (2.0 and onwards) support HTTP/1.1 natively. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 2 02:56:56 2001 by +Moshe Zadka +

+ +


+

4.43. Inexplicable syntax errors in compile() or exec.

+When a statement suite (as opposed to an expression) is compiled by +compile(), exec or execfile(), it must end in a newline. In some +cases, when the source ends in an indented block it appears that at +least two newlines are required. +

+ +Edit this entry / +Log info +

+ +


+

4.44. How do I convert a string to a number?

+For integers, use the built-in int() function, e.g. int('144') == 144. Similarly, long() converts from string to long integer, e.g. long('144') == 144L; and float() to floating-point, e.g. float('144') == 144.0. +

+Note that these are restricted to decimal interpretation, so +that int('0144') == 144 and int('0x144') raises ValueError. For Python +2.0 int takes the base to convert from as a second optional argument, so +int('0x144', 16) == 324. +

+For greater flexibility, or before Python 1.5, import the module +string and use the string.atoi() function for integers, +string.atol() for long integers, or string.atof() for +floating-point. E.g., +string.atoi('100', 16) == string.atoi('0x100', 0) == 256. +See the library reference manual section for the string module for +more details. +
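
+For example, using the functions mentioned above: +

+

+        import string
+
+        print int('144'), long('144'), float('144')   # prints: 144 144 144.0
+        print int('0x144', 16)                        # prints: 324 (Python 2.0 and later)
+        print string.atoi('100', 16)                  # prints: 256
+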

+While you could use the built-in function eval() instead of +any of those, this is not recommended, because someone could pass you +a Python expression that might have unwanted side effects (like +reformatting your disk). It also has the effect of interpreting numbers +as Python expressions, so that e.g. eval('09') gives a syntax error +since Python regards numbers starting with '0' as octal (base 8). +

+ +Edit this entry / +Log info + +/ Last changed on Thu Dec 28 12:37:34 2000 by +Bjorn Pettersen +

+ +


+

4.45. How do I convert a number to a string?

+To convert, e.g., the number 144 to the string '144', use the +built-in function repr() or the backquote notation (these are +equivalent). If you want a hexadecimal or octal representation, use +the built-in functions hex() or oct(), respectively. For fancy +formatting, use the % operator on strings, just like C printf formats, +e.g. "%04d" % 144 yields '0144' and "%.3f" % (1/3.0) yields '0.333'. +See the library reference manual for details. +
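
+For example: +

+

+        print repr(144), `144`       # both give the string '144'
+        print hex(144), oct(144)     # prints: 0x90 0220
+        print "%04d" % 144           # prints: 0144
+        print "%.3f" % (1/3.0)       # prints: 0.333
+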

+ +Edit this entry / +Log info +

+ +


+

4.46. How do I copy a file?

+There's the shutil module which contains a copyfile() +function that implements a copy loop; +it isn't good enough for the Macintosh, though: +it doesn't copy the resource fork and Finder info. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 2 02:59:40 2001 by +Moshe Zadka +

+ +


+

4.47. How do I check if an object is an instance of a given class or of a subclass of it?

+If you are developing the classes from scratch it might be better to +program in a more proper object-oriented style -- instead of doing a different +thing based on class membership, why not use a method and define the +method differently in different classes? +

+However, there are some legitimate situations +where you need to test for class membership. +

+In Python 1.5, you can use the built-in function isinstance(obj, cls). +
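+For example (the class names are made up):
+
+
+    class Base: pass
+    class Derived(Base): pass
+
+
+    d = Derived()
+    print isinstance(d, Derived)   # prints 1 (true)
+    print isinstance(d, Base)      # prints 1 -- subclasses count too
+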

+The following approaches can be used with earlier Python versions: +

+An unobvious method is to raise the object +as an exception and to try to catch the exception with the class you're +testing for: +

+

+	def is_instance_of(the_instance, the_class):
+	    try:
+		raise the_instance
+	    except the_class:
+		return 1
+	    except:
+		return 0
+
+This technique can be used to distinguish "subclassness"
+from a collection of classes as well:
+

+

+                try:
+                              raise the_instance
+                except Audible:
+                              the_instance.play(largo)
+                except Visual:
+                              the_instance.display(gaudy)
+                except Olfactory:
+                              sniff(the_instance)
+                except:
+                              raise ValueError, "dunno what to do with this!"
+
+This uses the fact that exception catching tests for class or subclass +membership. +

+A different approach is to test for the presence of a class attribute that +is presumably unique for the given class. For instance: +

+

+	class MyClass:
+	    ThisIsMyClass = 1
+	    ...
+
+
+	def is_a_MyClass(the_instance):
+	    return hasattr(the_instance, 'ThisIsMyClass')
+
+This version is easier to inline, and probably faster (inlined it +is definitely faster). The disadvantage is that someone else could cheat: +

+

+	class IntruderClass:
+	    ThisIsMyClass = 1    # Masquerade as MyClass
+	    ...
+
+but this may be seen as a feature (anyway, there are plenty of other ways +to cheat in Python). Another disadvantage is that the class must be +prepared for the membership test. If you do not "control the +source code" for the class it may not be advisable to modify the +class to support testability. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Jan 2 15:16:04 1998 by +GvR +

+ +


+

4.48. What is delegation?

+Delegation refers to an object oriented technique Python programmers +may implement with particular ease. Consider the following: +

+

+  from string import upper
+
+
+  class UpperOut:
+        def __init__(self, outfile):
+              self.__outfile = outfile
+        def write(self, str):
+              self.__outfile.write( upper(str) )
+        def __getattr__(self, name):
+              return getattr(self.__outfile, name)
+
+Here the UpperOut class redefines the write method +to convert the argument string to upper case before +calling the underlying self.__outfile.write method, but +all other methods are delegated to the underlying +self.__outfile object. The delegation is accomplished +via the "magic" __getattr__ method. Please see the +language reference for more information on the use +of this method. +
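+As a quick illustration, the class above can wrap an ordinary file object
+such as sys.stdout:
+
+
+  import sys
+
+
+  out = UpperOut(sys.stdout)
+  out.write("delegation at work\n")   # prints DELEGATION AT WORK
+  out.flush()                         # flush() is delegated to sys.stdout
+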

+Note that for more general cases delegation can
+get trickier. In particular, when attributes must be set
+as well as gotten, the class must define a __setattr__
+method too, and it must do so carefully.
+

+The basic implementation of __setattr__ is roughly +equivalent to the following: +

+

+   class X:
+        ...
+        def __setattr__(self, name, value):
+             self.__dict__[name] = value
+        ...
+
+Most __setattr__ implementations must modify +self.__dict__ to store local state for self without +causing an infinite recursion. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Aug 13 07:11:24 1997 by +aaron watters +

+ +


+

4.49. How do I test a Python program or component?

+We presume for the purposes of this question you are interested +in standalone testing, rather than testing your components inside +a testing framework. The best-known testing framework for Python +is the PyUnit module, maintained at +

+

+    http://pyunit.sourceforge.net/
+
+For standalone testing, it helps to write the program so that +it may be easily tested by using good modular design. +In particular your program +should have almost all functionality encapsulated in either functions +or class methods -- and this sometimes has the surprising and +delightful effect of making the program run faster (because +local variable accesses are faster than global accesses). +Furthermore the program should avoid depending on mutating +global variables, since this makes testing much more difficult to do. +

+The "global main logic" of your program may be as simple +as +

+

+  if __name__=="__main__":
+       main_logic()
+
+at the bottom of the main module of your program. +

+Once your program is organized as a tractable collection +of functions and class behaviours you should write test +functions that exercise the behaviours. A test suite +can be associated with each module which automates +a sequence of tests. This sounds like a lot of work, but +since Python is so terse and flexible it's surprisingly easy. +You can make coding much more pleasant and fun by +writing your test functions in parallel with the "production +code", since this makes it easy to find bugs and even +design flaws earlier. +

+"Support modules" that are not intended to be the main +module of a program may include a "test script interpretation" +which invokes a self test of the module. +

+

+   if __name__ == "__main__":
+      self_test()
+
+Even programs that interact with complex external +interfaces may be tested when the external interfaces are +unavailable by using "fake" interfaces implemented in +Python. For an example of a "fake" interface, the following +class defines (part of) a "fake" file interface: +

+

+ import string
+ testdata = "just a random sequence of characters"
+
+
+ class FakeInputFile:
+   data = testdata
+   position = 0
+   closed = 0
+
+
+   def read(self, n=None):
+       self.testclosed()
+       p = self.position
+       if n is None:
+          result= self.data[p:]
+       else:
+          result= self.data[p: p+n]
+       self.position = p + len(result)
+       return result
+
+
+   def seek(self, n, m=0):
+       self.testclosed()
+       last = len(self.data)
+       p = self.position
+       if m==0: 
+          final=n
+       elif m==1:
+          final=n+p
+       elif m==2:
+          final=len(self.data)+n
+       else:
+          raise ValueError, "bad m"
+       if final<0:
+          raise IOError, "negative seek"
+       self.position = final
+
+
+   def isatty(self):
+       return 0
+
+
+   def tell(self):
+       return self.position
+
+
+   def close(self):
+       self.closed = 1
+
+
+   def testclosed(self):
+       if self.closed:
+          raise IOError, "file closed"
+
+Try f=FakeInputFile() and test out its operations. +
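+For instance, a short interactive test of the fake file might read:
+
+
+ f = FakeInputFile()
+ print f.read(4)     # prints: just
+ f.seek(0)
+ print f.tell()      # prints: 0
+ print f.read()      # prints the whole test string
+ f.close()
+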

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 01:12:10 2002 by +Neal Norwitz +

+ +


+

4.50. My multidimensional list (array) is broken! What gives?

+You probably tried to make a multidimensional array like this. +

+

+   A = [[None] * 2] * 3
+
+This makes a list containing 3 references to the same list of length +two. Changes to one row will show in all rows, which is probably not +what you want. The following works much better: +

+

+   A = [None]*3
+   for i in range(3):
+        A[i] = [None] * 2
+
+This generates a list containing 3 different lists of length two. +

+If you feel weird, you can also do it in the following way: +

+

+   w, h = 2, 3
+   A = map(lambda i,w=w: [None] * w, range(h))
+
+For Python 2.0 the above can be spelled using a list comprehension: +

+

+   w,h = 2,3
+   A = [ [None]*w for i in range(h) ]
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Dec 28 12:18:35 2000 by +Bjorn Pettersen +

+ +


+

4.51. I want to do a complicated sort: can you do a Schwartzian Transform in Python?

+Yes, and in Python you only have to write it once: +

+

+ def st(List, Metric):
+     def pairing(element, M = Metric):
+           return (M(element), element)
+     paired = map(pairing, List)
+     paired.sort()
+     return map(stripit, paired)
+
+
+ def stripit(pair):
+     return pair[1]
+
+This technique, attributed to Randal Schwartz, sorts the elements
+of a list by a metric which maps each element to its "sort value".
+For example, if L is a list of strings, then
+

+

+   import string
+   Usorted = st(L, string.upper)
+
+
+   def intfield(s):
+         return string.atoi( string.strip(s[10:15] ) )
+
+
+   Isorted = st(L, intfield)
+
+Usorted gives the elements of L sorted as if they were upper +case, and Isorted gives the elements of L sorted by the integer +values that appear in the string slices starting at position 10 +and ending at position 15. In Python 2.0 this can be done more +naturally with list comprehensions: +

+

+  tmp1 = [ (x.upper(), x) for x in L ] # Schwartzian transform
+  tmp1.sort()
+  Usorted = [ x[1] for x in tmp1 ]
+
+
+  tmp2 = [ (int(s[10:15]), s) for s in L ] # Schwartzian transform
+  tmp2.sort()
+  Isorted = [ x[1] for x in tmp2 ]
+
+

+Note that Isorted may also be computed by +

+

+   def Icmp(s1, s2):
+         return cmp( intfield(s1), intfield(s2) )
+
+
+   Isorted = L[:]
+   Isorted.sort(Icmp)
+
+but since this method computes intfield many times for each +element of L, it is slower than the Schwartzian Transform. +

+ +Edit this entry / +Log info + +/ Last changed on Sat Jun 1 19:18:46 2002 by +Neal Norwitz +

+ +


+

4.52. How to convert between tuples and lists?

+The function tuple(seq) converts any sequence into a tuple with +the same items in the same order. +For example, tuple([1, 2, 3]) yields (1, 2, 3) and tuple('abc') +yields ('a', 'b', 'c'). If the argument is +a tuple, it does not make a copy but returns the same object, so +it is cheap to call tuple() when you aren't sure that an object +is already a tuple. +

+The function list(seq) converts any sequence into a list with +the same items in the same order. +For example, list((1, 2, 3)) yields [1, 2, 3] and list('abc') +yields ['a', 'b', 'c']. If the argument is a list, +it makes a copy just like seq[:] would. +

+ +Edit this entry / +Log info + +/ Last changed on Sun Jun 14 14:18:53 1998 by +Tim Peters +

+ +


+

4.53. Files retrieved with urllib contain leading garbage that looks like email headers.

+Extremely old versions of Python supplied libraries which +did not support HTTP/1.1; the vanilla httplib in Python 1.4 +only recognized HTTP/1.0. In Python 2.0 full HTTP/1.1 support is included. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jan 8 17:26:18 2001 by +Steve Holden +

+ +


+

4.54. How do I get a list of all instances of a given class?

+Python does not keep track of all instances of a class (or of a +built-in type). +

+You can program the class's constructor to keep track of all
+instances, but unless you're very clever, this has the disadvantage
+that the instances never get deleted, because your list of all
+instances keeps a reference to them.
+

+(The trick is to regularly inspect the reference counts of the +instances you've retained, and if the reference count is below a +certain level, remove it from the list. Determining that level is +tricky -- it's definitely larger than 1.) +
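+A rough sketch of that bookkeeping approach (the class name is made up;
+note that, as explained above, these instances will never be garbage
+collected because the list keeps them alive):
+
+
+    class Tracked:
+        instances = []                    # class-level list of all instances
+
+
+        def __init__(self):
+            Tracked.instances.append(self)
+
+
+    a = Tracked(); b = Tracked()
+    print len(Tracked.instances)          # prints 2
+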

+ +Edit this entry / +Log info + +/ Last changed on Tue May 27 23:52:16 1997 by +GvR +

+ +


+

4.55. A regular expression fails with regex.error: match failure.

+This is usually caused by too much backtracking; the regular +expression engine has a fixed size stack which holds at most 4000 +backtrack points. Every character matched by e.g. ".*" accounts for a +backtrack point, so even a simple search like +

+

+  regex.match('.*x',"x"*5000)
+
+will fail. +

+This is fixed in the re module introduced with +Python 1.5; consult the Library Reference section on re for more information. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Jul 30 12:35:49 1998 by +A.M. Kuchling +

+ +


+

4.56. I can't get signal handlers to work.

+The most common problem is that the signal handler is declared +with the wrong argument list. It is called as +

+

+	handler(signum, frame)
+
+so it should be declared with two arguments: +

+

+	def handler(signum, frame):
+		...
+
+
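+A small, self-contained sketch that installs such a handler for SIGINT
+(the interrupt signal sent by Ctrl-C):
+
+
+	import signal, time
+
+
+	def handler(signum, frame):
+		print "Caught signal", signum
+
+
+	signal.signal(signal.SIGINT, handler)
+	time.sleep(60)	# press Ctrl-C within 60 seconds to trigger the handler
+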

+ +Edit this entry / +Log info + +/ Last changed on Wed May 28 09:29:08 1997 by +GvR +

+ +


+

4.57. I can't use a global variable in a function? Help!

+Did you do something like this? +

+

+   x = 1 # make a global
+
+
+   def f():
+         print x # try to print the global
+         ...
+         for j in range(100):
+              if q>3:
+                 x=4
+
+Any variable assigned in a function is local to that function,
+unless it is specifically declared global. Since a value is bound
+to x as the last statement of the function body, the compiler
+assumes that x is local. Consequently the "print x"
+attempts to print an uninitialized local variable and will
+trigger a NameError.
+

+In such cases the solution is to insert an explicit global +declaration at the start of the function, making it +

+

+

+   def f():
+         global x
+         print x # try to print the global
+         ...
+         for j in range(100):
+              if q>3:
+                 x=4
+
+

+In this case, all references to x are interpreted as references +to the x from the module namespace. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Feb 12 15:52:12 2001 by +Steve Holden +

+ +


+

4.58. What's a negative index? Why doesn't list.insert() use them?

+Python sequences are indexed with positive numbers and
+negative numbers. For positive numbers, 0 is the first index,
+1 is the second index, and so forth. For negative indices, -1
+is the last index, -2 is the penultimate (next to last) index,
+and so forth. Think of seq[-n] as the same as seq[len(seq)-n].
+

+Using negative indices can be very convenient. For example +if the string Line ends in a newline then Line[:-1] is all of Line except +the newline. +

+Sadly the list builtin method L.insert does not observe negative +indices. This feature could be considered a mistake but since +existing programs depend on this feature it may stay around +forever. L.insert for negative indices inserts at the start of the +list. To get "proper" negative index behaviour use L[n:n] = [x] +in place of the insert method. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Aug 13 07:03:18 1997 by +aaron watters +

+ +


+

4.59. How can I sort one list by values from another list?

+You can sort lists of tuples. +

+

+  >>> list1 = ["what", "I'm", "sorting", "by"]
+  >>> list2 = ["something", "else", "to", "sort"]
+  >>> pairs = map(None, list1, list2)
+  >>> pairs
+  [('what', 'something'), ("I'm", 'else'), ('sorting', 'to'), ('by', 'sort')]
+  >>> pairs.sort()
+  >>> pairs
+  [("I'm", 'else'), ('by', 'sort'), ('sorting', 'to'), ('what', 'something')]
+  >>> result = pairs[:]
+  >>> for i in xrange(len(result)): result[i] = result[i][1]
+  ...
+  >>> result
+  ['else', 'sort', 'to', 'something']
+
+And if you didn't understand the question, please see the +example above ;c). Note that "I'm" sorts before "by" because +uppercase "I" comes before lowercase "b" in the ascii order. +Also see 4.51. +

+In Python 2.0 this can be done like: +

+

+ >>> list1 = ["what", "I'm", "sorting", "by"]
+ >>> list2 = ["something", "else", "to", "sort"]
+ >>> pairs = zip(list1, list2)
+ >>> pairs
+ [('what', 'something'), ("I'm", 'else'), ('sorting', 'to'), ('by', 'sort')]
+ >>> pairs.sort()
+ >>> result = [ x[1] for x in pairs ]
+ >>> result
+ ['else', 'sort', 'to', 'something']
+
+[Followup] +

+Someone asked, why not this for the last steps: +

+

+  result = []
+  for p in pairs: result.append(p[1])
+
+This is much more legible. However, a quick test shows that +it is almost twice as slow for long lists. Why? First of all, +the append() operation has to reallocate memory, and while it +uses some tricks to avoid doing that each time, it still has +to do it occasionally, and apparently that costs quite a bit. +Second, the expression "result.append" requires an extra +attribute lookup. The attribute lookup could be done away +with by rewriting as follows: +

+

+  result = []
+  append = result.append
+  for p in pairs: append(p[1])
+
+which gains back some speed, but is still considerably slower +than the original solution, and hardly less convoluted. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Dec 28 12:56:35 2000 by +Bjorn Pettersen +

+ +


+

4.60. Why doesn't dir() work on builtin types like files and lists?

+It does starting with Python 1.5. +

+Using 1.4, you can find out which methods a given object supports +by looking at its __methods__ attribute: +

+

+    >>> List = []
+    >>> List.__methods__
+    ['append', 'count', 'index', 'insert', 'remove', 'reverse', 'sort']
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Sep 16 14:56:42 1999 by +Skip Montanaro +

+ +


+

4.61. How can I mimic CGI form submission (METHOD=POST)?

+I would like to retrieve web pages that are the result of POSTing a +form. Is there existing code that would let me do this easily? +

+Yes. Here's a simple example that uses httplib. +

+

+    #!/usr/local/bin/python
+
+
+    import httplib, sys, time
+
+
+    ### build the query string
+    qs = "First=Josephine&MI=Q&Last=Public"
+
+
+    ### connect and send the server a path
+    httpobj = httplib.HTTP('www.some-server.out-there', 80)
+    httpobj.putrequest('POST', '/cgi-bin/some-cgi-script')
+    ### now generate the rest of the HTTP headers...
+    httpobj.putheader('Accept', '*/*')
+    httpobj.putheader('Connection', 'Keep-Alive')
+    httpobj.putheader('Content-type', 'application/x-www-form-urlencoded')
+    httpobj.putheader('Content-length', '%d' % len(qs))
+    httpobj.endheaders()
+    httpobj.send(qs)
+    ### find out what the server said in response...
+    reply, msg, hdrs = httpobj.getreply()
+    if reply != 200:
+	sys.stdout.write(httpobj.getfile().read())
+
+Note that in general for "url encoded posts" (the default) query strings must be "quoted" to, for example, change equals signs and spaces to an encoded form when they occur in name or value. Use urllib.quote to perform this quoting. For example to send name="Guy Steele, Jr.": +

+

+   >>> from urllib import quote
+   >>> x = quote("Guy Steele, Jr.")
+   >>> x
+   'Guy%20Steele,%20Jr.'
+   >>> query_string = "name="+x
+   >>> query_string
+   'name=Guy%20Steele,%20Jr.'
+
+

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 21 03:47:07 1999 by +TAB +

+ +


+

4.62. If my program crashes with a bsddb (or anydbm) database open, it gets corrupted. How come?

+Databases opened for write access with the bsddb module (and often by
+the anydbm module, since it will preferentially use bsddb) must
+explicitly be closed using the close method of the database. The
+underlying libdb package caches database contents which need to be
+converted to on-disk form and written, unlike regular open files which
+already have the on-disk bits in the kernel's write buffer, where they
+can just be dumped by the kernel when the program exits.
+

+If you have initialized a new bsddb database but not written anything to +it before the program crashes, you will often wind up with a zero-length +file and encounter an exception the next time the file is opened. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 01:15:01 2002 by +Neal Norwitz +

+ +


+

4.63. How do I make a Python script executable on Unix?

+You need to do two things: the script file's mode must be executable +(include the 'x' bit), and the first line must begin with #! +followed by the pathname for the Python interpreter. +

+The first is done by executing 'chmod +x scriptfile' or perhaps +'chmod 755 scriptfile'. +

+The second can be done in a number of ways. The most straightforward
+way is to write
+

+

+  #!/usr/local/bin/python
+
+as the very first line of your file - or whatever the pathname is +where the python interpreter is installed on your platform. +

+If you would like the script to be independent of where the python +interpreter lives, you can use the "env" program. On almost all +platforms, the following will work, assuming the python interpreter +is in a directory on the user's $PATH: +

+

+  #! /usr/bin/env python
+
+Note -- *don't* do this for CGI scripts. The $PATH variable for +CGI scripts is often very minimal, so you need to use the actual +absolute pathname of the interpreter. +

+Occasionally, a user's environment is so full that the /usr/bin/env +program fails; or there's no env program at all. +In that case, you can try the following hack (due to Alex Rezinsky): +

+

+  #! /bin/sh
+  """:"
+  exec python $0 ${1+"$@"}
+  """
+
+The disadvantage is that this defines the script's __doc__ string. +However, you can fix that by adding +

+

+  __doc__ = """...Whatever..."""
+
+

+ +Edit this entry / +Log info + +/ Last changed on Mon Jan 15 09:19:16 2001 by +Neal Norwitz +

+ +


+

4.64. How do you remove duplicates from a list?

+See the Python Cookbook for a long discussion of many cool ways: +

+

+    http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52560
+
+Generally, if you don't mind reordering the List +

+

+   if List:
+      List.sort()
+      last = List[-1]
+      for i in range(len(List)-2, -1, -1):
+          if last==List[i]: del List[i]
+          else: last=List[i]
+
+If all elements of the list may be used as +dictionary keys (ie, they are all hashable) +this is often faster +

+

+   d = {}
+   for x in List: d[x]=x
+   List = d.values()
+
+Also, for extremely large lists you might +consider more optimal alternatives to the first one. +The second one is pretty good whenever it can +be used. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 24 21:56:33 2002 by +Tim Peters +

+ +


+

4.65. Are there any known year 2000 problems in Python?

+I am not aware of year 2000 deficiencies in Python 1.5. Python does
+very few date calculations and for what it does, it relies on the C
+library functions. Python generally represents times either as seconds
+since 1970 or as a tuple (year, month, day, ...) where the year is
+expressed with four digits, which makes Y2K bugs unlikely. So as long
+as your C library is okay, Python should be okay. Of course, I cannot
+vouch for your Python code!
+

+Given the nature of freely available software, I have to add that this statement is not +legally binding. The Python copyright notice contains the following +disclaimer: +

+

+  STICHTING MATHEMATISCH CENTRUM AND CNRI DISCLAIM ALL WARRANTIES WITH
+  REGARD TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF
+  MERCHANTABILITY AND FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH
+  CENTRUM OR CNRI BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL
+  DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR
+  PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER
+  TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
+  PERFORMANCE OF THIS SOFTWARE.
+
+The good news is that if you encounter a problem, you have full +source available to track it down and fix it! +

+ +Edit this entry / +Log info + +/ Last changed on Fri Apr 10 14:59:31 1998 by +GvR +

+ +


+

4.66. I want a version of map that applies a method to a sequence of objects! Help!

+Get fancy! +

+

+  def method_map(objects, method, arguments):
+       """method_map([a,b], "flog", (1,2)) gives [a.flog(1,2), b.flog(1,2)]"""
+       nobjects = len(objects)
+       methods = map(getattr, objects, [method]*nobjects)
+       return map(apply, methods, [arguments]*nobjects)
+
+It's generally a good idea to get to know the mysteries of map and apply +and getattr and the other dynamic features of Python. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jan 5 14:21:14 1998 by +Aaron Watters +

+ +


+

4.67. How do I generate random numbers in Python?

+The standard library module "random" implements a random number +generator. Usage is simple: +

+

+    import random
+
+
+    random.random()
+
+This returns a random floating point number in the range [0, 1). +

+There are also many other specialized generators in this module, such +as +

+

+    randrange(a, b) chooses an integer in the range [a, b)
+    uniform(a, b) chooses a floating point number in the range [a, b)
+    normalvariate(mean, sdev) sample from normal (Gaussian) distribution
+
+Some higher-level functions operate on sequences directly, such as +

+

+    choice(S) chooses random element from a given sequence
+    shuffle(L) shuffles a list in-place, i.e. permutes it randomly
+
+There's also a class, Random, which you can instantiate +to create independent multiple random number generators. +

+All this is documented in the library reference manual. Note that +the module "whrandom" is obsolete. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 01:16:51 2002 by +Neal Norwitz +

+ +


+

4.68. How do I access the serial (RS232) port?

+There's a Windows serial communication module (for communication +over RS 232 serial ports) at +

+

+  ftp://ftp.python.org/pub/python/contrib/sio-151.zip
+  http://www.python.org/ftp/python/contrib/sio-151.zip
+
+For DOS, try Hans Nowak's Python-DX, which supports this, at: +

+

+  http://www.cuci.nl/~hnowak/
+
+For Unix, see a usenet post by Mitch Chapman: +

+

+  http://groups.google.com/groups?selm=34A04430.CF9@ohioee.com
+
+For Win32, POSIX(Linux, BSD, *), Jython, Chris': +

+

+  http://pyserial.sourceforge.net
+
+

+ +Edit this entry / +Log info + +/ Last changed on Tue Jul 2 21:11:07 2002 by +Chris Liechti +

+ +


+

4.69. Images on Tk-Buttons don't work in Py15?

+They do work, but you must keep your own reference to the image +object now. More verbosely, you must make sure that, say, a global +variable or a class attribute refers to the object. +

+Quoting Fredrik Lundh from the mailinglist: +

+

+  Well, the Tk button widget keeps a reference to the internal
+  photoimage object, but Tkinter does not.  So when the last
+  Python reference goes away, Tkinter tells Tk to release the
+  photoimage.  But since the image is in use by a widget, Tk
+  doesn't destroy it.  Not completely.  It just blanks the image,
+  making it completely transparent...
+
+
+  And yes, there was a bug in the keyword argument handling
+  in 1.4 that kept an extra reference around in some cases.  And
+  when Guido fixed that bug in 1.5, he broke quite a few Tkinter
+  programs...
+
+

+ +Edit this entry / +Log info + +/ Last changed on Tue Feb 3 11:31:03 1998 by +Case Roole +

+ +


+

4.70. Where is the math.py (socket.py, regex.py, etc.) source file?

+If you can't find a source file for a module it may be a builtin +or dynamically loaded module implemented in C, C++ or other +compiled language. In this case you may not have the source +file or it may be something like mathmodule.c, somewhere in +a C source directory (not on the Python Path). +

+Fredrik Lundh (fredrik@pythonware.com) explains (on the python-list): +

+There are (at least) three kinds of modules in Python: +1) modules written in Python (.py); +2) modules written in C and dynamically loaded (.dll, .pyd, .so, .sl, etc); +3) modules written in C and linked with the interpreter; to get a list +of these, type: +

+

+    import sys
+    print sys.builtin_module_names
+
+

+ +Edit this entry / +Log info + +/ Last changed on Tue Feb 3 13:55:33 1998 by +Aaron Watters +

+ +


+

4.71. How do I send mail from a Python script?

+The standard library module smtplib does this. +Here's a very simple interactive mail +sender that uses it. This method will work on any host that +supports an SMTP listener. +

+

+    import sys, smtplib
+
+
+    fromaddr = raw_input("From: ")
+    toaddrs  = raw_input("To: ").split(',')
+    print "Enter message, end with ^D:"
+    msg = ''
+    while 1:
+        line = sys.stdin.readline()
+        if not line:
+            break
+        msg = msg + line
+
+
+    # The actual mail send
+    server = smtplib.SMTP('localhost')
+    server.sendmail(fromaddr, toaddrs, msg)
+    server.quit()
+
+If the local host doesn't have an SMTP listener, you need to find one. The simple method is to ask the user. Alternately, you can use the DNS system to find the mail gateway(s) responsible for the source address. +

+A Unix-only alternative uses sendmail. The location of the
+sendmail program varies between systems; sometimes it is
+/usr/lib/sendmail, sometimes /usr/sbin/sendmail. The sendmail manual
+page will help you out. Here's some sample code:
+

+

+  SENDMAIL = "/usr/sbin/sendmail" # sendmail location
+  import os
+  p = os.popen("%s -t -i" % SENDMAIL, "w")
+  p.write("To: cary@ratatosk.org\n")
+  p.write("Subject: test\n")
+  p.write("\n") # blank line separating headers from body
+  p.write("Some text\n")
+  p.write("some more text\n")
+  sts = p.close()
+  if sts != 0:
+      print "Sendmail exit status", sts
+
+

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 07:05:12 2002 by +Matthias Urlichs +

+ +


+

4.72. How do I avoid blocking in connect() of a socket?

+The select module is widely known to help with asynchronous +I/O on sockets once they are connected. However, it is less +than common knowledge how to avoid blocking on the initial +connect() call. Jeremy Hylton has the following advice (slightly +edited): +

+To prevent the TCP connect from blocking, you can set the socket to +non-blocking mode. Then when you do the connect(), you will either +connect immediately (unlikely) or get an exception that contains the +errno. errno.EINPROGRESS indicates that the connection is in +progress, but hasn't finished yet. Different OSes will return +different errnos, so you're going to have to check. I can tell you +that different versions of Solaris return different errno values. +

+In Python 1.5 and later, you can use connect_ex() to avoid +creating an exception. It will just return the errno value. +

+To poll, you can call connect_ex() again later -- 0 or errno.EISCONN +indicate that you're connected -- or you can pass this socket to +select (checking to see if it is writeable). +
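+A rough sketch of the whole approach (the host and port are made up, and
+the exact errno values seen will vary by platform, as noted above):
+
+
+  import socket, select, errno
+
+
+  s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
+  s.setblocking(0)                              # switch to non-blocking mode
+  err = s.connect_ex(("www.example.com", 80))   # hypothetical host and port
+  # err is now typically errno.EINPROGRESS (Unix) or errno.EWOULDBLOCK
+
+
+  # wait up to 10 seconds for the socket to become writable
+  r, w, x = select.select([], [s], [], 10.0)
+  if w and s.connect_ex(("www.example.com", 80)) in (0, errno.EISCONN):
+      print "connected"
+  else:
+      print "connect failed or timed out"
+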

+ +Edit this entry / +Log info + +/ Last changed on Tue Feb 24 21:30:45 1998 by +GvR +

+ +


+

4.73. How do I specify hexadecimal and octal integers?

+To specify an octal digit, precede the octal value with a zero. For example, +to set the variable "a" to the octal value "10" (8 in decimal), type: +

+

+    >>> a = 010
+
+To verify that this works, you can type "a" and hit enter while in the +interpreter, which will cause Python to spit out the current value of "a" +in decimal: +

+

+    >>> a
+    8
+
+Hexadecimal is just as easy. Simply precede the hexadecimal number with a +zero, and then a lower or uppercase "x". Hexadecimal digits can be specified +in lower or uppercase. For example, in the Python interpreter: +

+

+    >>> a = 0xa5
+    >>> a
+    165
+    >>> b = 0XB2
+    >>> b
+    178
+
+

+ +Edit this entry / +Log info + +/ Last changed on Tue Mar 3 12:53:16 1998 by +GvR +

+ +


+

4.74. How to get a single keypress at a time?

+For Windows, see question 8.2. Here is an answer for Unix (see also 4.94). +

+There are several solutions; some involve using curses, which is a +pretty big thing to learn. Here's a solution without curses, due +to Andrew Kuchling (adapted from code to do a PGP-style +randomness pool): +

+

+        import termios, sys, os
+        fd = sys.stdin.fileno()
+        old = termios.tcgetattr(fd)
+        new = termios.tcgetattr(fd)
+        new[3] = new[3] & ~termios.ICANON & ~termios.ECHO
+        new[6][termios.VMIN] = 1
+        new[6][termios.VTIME] = 0
+        termios.tcsetattr(fd, termios.TCSANOW, new)
+        s = ''    # We'll save the characters typed and add them to the pool.
+        try:
+            while 1:
+                c = os.read(fd, 1)
+                print "Got character", `c`
+                s = s+c
+        finally:
+            termios.tcsetattr(fd, termios.TCSAFLUSH, old)
+
+You need the termios module for any of this to work, and I've only
+tried it on Linux, though it should work elsewhere. It turns off
+stdin's echoing and disables canonical mode, and then reads one
+character at a time from stdin.
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Oct 24 00:36:56 2002 by +chris +

+ +


+

4.75. How can I overload constructors (or methods) in Python?

+(This actually applies to all methods, but somehow the question +usually comes up first in the context of constructors.) +

+Where in C++ you'd write +

+

+    class C {
+        C() { cout << "No arguments\n"; }
+        C(int i) { cout << "Argument is " << i << "\n"; }
+    }
+
+in Python you have to write a single constructor that catches all +cases using default arguments. For example: +

+

+    class C:
+        def __init__(self, i=None):
+            if i is None:
+                print "No arguments"
+            else:
+                print "Argument is", i
+
+This is not entirely equivalent, but close enough in practice. +

+You could also try a variable-length argument list, e.g. +

+

+        def __init__(self, *args):
+            ....
+
+The same approach works for all method definitions. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Apr 20 11:55:55 1998 by +GvR +

+ +


+

4.76. How do I pass keyword arguments from one method to another?

+Use apply. For example: +

+

+    class Account:
+        def __init__(self, **kw):
+            self.accountType = kw.get('accountType')
+            self.balance = kw.get('balance')
+
+
+    class CheckingAccount(Account):
+        def __init__(self, **kw):
+            kw['accountType'] = 'checking'
+            apply(Account.__init__, (self,), kw)
+
+
+    myAccount = CheckingAccount(balance=100.00)
+
+In Python 2.0 you can call it directly using the new ** syntax: +

+

+    class CheckingAccount(Account):
+        def __init__(self, **kw):
+            kw['accountType'] = 'checking'
+            Account.__init__(self, **kw)
+
+or more generally: +

+

+ >>> def f(x, *y, **z):
+ ...  print x,y,z
+ ...
+ >>> Y = [1,2,3]
+ >>> Z = {'foo':3,'bar':None}
+ >>> f('hello', *Y, **Z)
+ hello (1, 2, 3) {'foo': 3, 'bar': None}
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Dec 28 13:04:01 2000 by +Bjorn Pettersen +

+ +


+

4.77. What module should I use to help with generating HTML?

+Check out HTMLgen written by Robin Friedrich. It's a class library +of objects corresponding to all the HTML 3.2 markup tags. It's used +when you are writing in Python and wish to synthesize HTML pages for +generating a web or for CGI forms, etc. +

+It can be found in the FTP contrib area on python.org or on the +Starship. Use the search engines there to locate the latest version. +

+It might also be useful to consider DocumentTemplate, which offers clear
+separation between Python code and HTML code. DocumentTemplate is part
+of the Bobo objects publishing system (http://www.digicool.com/releases)
+but can be used independently of course!
+

+ +Edit this entry / +Log info + +/ Last changed on Fri Aug 28 09:54:58 1998 by +GvR +

+ +


+

4.78. How do I create documentation from doc strings?

+Use gendoc, by Daniel Larson. See +

+http://starship.python.net/crew/danilo/ +

+It can create HTML from the doc strings in your Python source code. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Oct 7 17:15:51 2002 by +Phil Rittenhouse +

+ +


+

4.79. How do I read (or write) binary data?

+For complex data formats, it's best to use
+the struct module. It's documented in the library reference.
+It allows you to take a string read from a file containing binary
+data (usually numbers) and convert it to Python objects; and vice
+versa.
+

+For example, the following code reads two 2-byte integers +and one 4-byte integer in big-endian format from a file: +

+

+  import struct
+
+
+  f = open(filename, "rb")  # Open in binary mode for portability
+  s = f.read(8)
+  x, y, z = struct.unpack(">hhl", s)
+
+The '>' in the format string forces big-endian data; the letter
+'h' reads one "short integer" (2 bytes), and 'l' reads one
+"long integer" (4 bytes) from the string.
+
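+The reverse direction uses struct.pack() with the same kind of format
+string; a short sketch mirroring the example above:
+
+
+  import struct
+
+
+  s = struct.pack(">hhl", 1, 2, 3)   # two shorts and a long, big-endian
+  f = open(filename, "wb")           # binary mode again, for portability
+  f.write(s)
+  f.close()
+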

+For data that is more regular (e.g. a homogeneous list of ints or +floats), you can also use the array module, also documented +in the library reference. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Oct 7 09:16:45 1998 by +GvR +

+ +


+

4.80. I can't get key bindings to work in Tkinter

+An oft-heard complaint is that event handlers bound to events +with the bind() method don't get handled even when the appropriate +key is pressed. +

+The most common cause is that the widget to which the binding applies
+doesn't have "keyboard focus". Check out the Tk documentation
+for the focus command. Usually a widget is given the keyboard
+focus by clicking in it (but not for labels; see the takefocus
+option).
+
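+A minimal sketch illustrating the point -- the binding only fires once the
+Entry widget has keyboard focus, which focus_set() grants explicitly:
+
+
+    from Tkinter import Tk, Entry
+
+
+    def on_key(event):
+        print "Got key:", event.char
+
+
+    root = Tk()
+    entry = Entry(root)
+    entry.pack()
+    entry.bind("<Key>", on_key)
+    entry.focus_set()          # give the widget keyboard focus
+    root.mainloop()
+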

+ +Edit this entry / +Log info + +/ Last changed on Fri Jun 12 09:37:33 1998 by +GvR +

+ +


+

4.81. "import crypt" fails

+[Unix] +

+Starting with Python 1.5, the crypt module is disabled by default. +In order to enable it, you must go into the Python source tree and +edit the file Modules/Setup to enable it (remove a '#' sign in +front of the line starting with '#crypt'). Then rebuild. +You may also have to add the string '-lcrypt' to that same line. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Aug 5 08:57:09 1998 by +GvR +

+ +


+

4.82. Are there coding standards or a style guide for Python programs?

+Yes, Guido has written the "Python Style Guide". See +http://www.python.org/doc/essays/styleguide.html +

+ +Edit this entry / +Log info + +/ Last changed on Tue Sep 29 09:50:27 1998 by +Joseph VanAndel +

+ +


+

4.83. How do I freeze Tkinter applications?

+Freeze is a tool to create stand-alone applications (see 4.28). +

+When freezing Tkinter applications, the applications will not be +truly stand-alone, as the application will still need the tcl and +tk libraries. +

+One solution is to ship the application with the tcl and tk libraries, +and point to them at run-time using the TCL_LIBRARY and TK_LIBRARY +environment variables. +

+To get truly stand-alone applications, the Tcl scripts that form +the library have to be integrated into the application as well. One +tool supporting that is SAM (stand-alone modules), which is part +of the Tix distribution (http://tix.mne.com). Build Tix with SAM +enabled, perform the appropriate call to Tclsam_init etc inside +Python's Modules/tkappinit.c, and link with libtclsam +and libtksam (you might include the Tix libraries as well). +

+ +Edit this entry / +Log info + +/ Last changed on Wed Jan 20 17:35:01 1999 by +Martin v. Löwis +

+ +


+

4.84. How do I create static class data and static class methods?

+[Tim Peters, tim_one@email.msn.com] +

+Static data (in the sense of C++ or Java) is easy; static methods (again in the sense of C++ or Java) are not supported directly. +

+STATIC DATA +

+For example, +

+

+    class C:
+        count = 0   # number of times C.__init__ called
+
+
+        def __init__(self):
+            C.count = C.count + 1
+
+
+        def getcount(self):
+            return C.count  # or return self.count
+
+c.count also refers to C.count for any c such that isinstance(c, C) holds, unless overridden by c itself or by some class on the base-class search path from c.__class__ back to C. +

+Caution: within a method of C, +

+

+    self.count = 42
+
+creates a new and unrelated instance variable named "count" in self's own dict. So rebinding of a class-static data name needs the
+

+

+    C.count = 314
+
+form whether inside a method or not. +

+

+STATIC METHODS +

+Static methods (as opposed to static data) are unnatural in Python, because +

+

+    C.getcount
+
+returns an unbound method object, which can't be invoked without supplying an instance of C as the first argument. +

+The intended way to get the effect of a static method is via a module-level function: +

+

+    def getcount():
+        return C.count
+
+If your code is structured so as to define one class (or tightly related class hierarchy) per module, this supplies the desired encapsulation. +

+Several tortured schemes for faking static methods can be found by searching DejaNews. Most people feel such cures are worse than the disease. Perhaps the least obnoxious is due to Pekka Pessi (mailto:ppessi@hut.fi): +

+

+    # helper class to disguise function objects
+    class _static:
+        def __init__(self, f):
+            self.__call__ = f
+
+
+    class C:
+        count = 0
+
+
+        def __init__(self):
+            C.count = C.count + 1
+
+
+        def getcount():
+            return C.count
+        getcount = _static(getcount)
+
+
+        def sum(x, y):
+            return x + y
+        sum = _static(sum)
+
+
+    C(); C()
+    c = C()
+    print C.getcount()  # prints 3
+    print c.getcount()  # prints 3
+    print C.sum(27, 15) # prints 42
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Jan 21 21:35:38 1999 by +Tim Peters +

+ +


+

4.85. __import__('x.y.z') returns <module 'x'>; how do I get z?

+Try +

+

+   __import__('x.y.z').y.z
+
+For more realistic situations, you may have to do something like +

+

+   m = __import__(s)
+   for i in string.split(s, ".")[1:]:
+       m = getattr(m, i)
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Jan 28 11:01:43 1999 by +GvR +

+ +


+

4.86. Basic thread wisdom

+Please note that there is no way to take advantage of +multiprocessor hardware using the Python thread model. The interpreter +uses a global interpreter lock (GIL), +which does not allow multiple threads to be concurrently active. +

+If you write a simple test program like this: +

+

+  import thread
+  def run(name, n):
+      for i in range(n): print name, i
+  for i in range(10):
+      thread.start_new(run, (i, 100))
+
+none of the threads seem to run! The reason is that as soon as +the main thread exits, all threads are killed. +

+A simple fix is to add a sleep to the end of the program, +sufficiently long for all threads to finish: +

+

+  import thread, time
+  def run(name, n):
+      for i in range(n): print name, i
+  for i in range(10):
+      thread.start_new(run, (i, 100))
+  time.sleep(10) # <----------------------------!
+
+But now (on many platforms) the threads don't run in parallel, +but appear to run sequentially, one at a time! The reason is +that the OS thread scheduler doesn't start a new thread until +the previous thread is blocked. +

+A simple fix is to add a tiny sleep to the start of the run +function: +

+

+  import thread, time
+  def run(name, n):
+      time.sleep(0.001) # <---------------------!
+      for i in range(n): print name, i
+  for i in range(10):
+      thread.start_new(run, (i, 100))
+  time.sleep(10)
+
+Some more hints: +

+Instead of using a time.sleep() call at the end, it's
+better to use some kind of semaphore mechanism. One idea is to
+use the Queue module to create a queue object, let each thread
+append a token to the queue when it finishes, and let the main
+thread read as many tokens from the queue as there are threads.
+

+Use the threading module instead of the thread module. It's part +of Python since version 1.5.1. It takes care of all these details, +and has many other nice features too! +
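+A short sketch of the Queue-token idea, written with the threading module
+(the names are made up):
+
+
+  import threading, Queue
+
+
+  def run(name, n, done):
+      for i in range(n):
+          print name, i
+      done.put(name)                  # signal that this thread has finished
+
+
+  done = Queue.Queue()
+  for i in range(10):
+      threading.Thread(target=run, args=(i, 100, done)).start()
+  for i in range(10):
+      done.get()                      # wait for one token per thread
+  print "all threads finished"
+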

+ +Edit this entry / +Log info + +/ Last changed on Fri Feb 7 16:21:55 2003 by +GvR +

+ +


+

4.87. Why doesn't closing sys.stdout (stdin, stderr) really close it?

+Python file objects are a high-level layer of abstraction on top of C streams, which in turn are a medium-level layer of abstraction on top of (among other things) low-level C file descriptors. +

+For most file objects f you create in Python via the builtin "open" function, f.close() marks the Python file object as being closed from Python's point of view, and also arranges to close the underlying C stream. This happens automatically too, in f's destructor, when f becomes garbage. +

+But stdin, stdout and stderr are treated specially by Python, because of the special status also given to them by C: doing +

+

+    sys.stdout.close() # ditto for stdin and stderr
+
+marks the Python-level file object as being closed, but does not close the associated C stream (provided sys.stdout is still bound to its default value, which is the stream C also calls "stdout"). +

+To close the underlying C stream for one of these three, you should first be sure that's what you really want to do (e.g., you may confuse the heck out of extension modules trying to do I/O). If it is, use os.close: +

+

+    os.close(0)   # close C's stdin stream
+    os.close(1)   # close C's stdout stream
+    os.close(2)   # close C's stderr stream
+
+

+ +Edit this entry / +Log info + +/ Last changed on Sat Apr 17 02:22:35 1999 by +Tim Peters +

+ +


+

4.88. What kinds of global value mutation are thread-safe?

+[adapted from c.l.py responses by Gordon McMillan & GvR] +

+A global interpreter lock (GIL) is used internally to ensure that only one thread runs in the Python VM at a time. In general, Python offers to switch among threads only between bytecode instructions (how frequently it offers to switch can be set via sys.setcheckinterval). Each bytecode instruction-- and all the C implementation code reached from it --is therefore atomic. +

+In theory, this means an exact accounting requires an exact understanding of the PVM bytecode implementation. In practice, it means that operations on shared variables of builtin data types (ints, lists, dicts, etc) that "look atomic" really are.
+

+For example, these are atomic (L, L1, L2 are lists, D, D1, D2 are dicts, x, y +are objects, i, j are ints): +

+

+    L.append(x)
+    L1.extend(L2)
+    x = L[i]
+    x = L.pop()
+    L1[i:j] = L2
+    L.sort()
+    x = y
+    x.field = y
+    D[x] = y
+    D1.update(D2)
+    D.keys()
+
+These aren't: +

+

+    i = i+1
+    L.append(L[-1])
+    L[i] = L[j]
+    D[x] = D[x] + 1
+
+Note: operations that replace other objects may invoke those other objects' __del__ method when their reference count reaches zero, and that can affect things. This is especially true for the mass updates to dictionaries and lists. When in doubt, use a mutex! +
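+For example, a minimal sketch guarding a shared counter with a lock from
+the threading module:
+
+
+    import threading
+
+
+    counter = 0
+    counter_lock = threading.Lock()
+
+
+    def increment():
+        global counter
+        counter_lock.acquire()
+        try:
+            counter = counter + 1     # not atomic on its own, so guard it
+        finally:
+            counter_lock.release()
+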

+ +Edit this entry / +Log info + +/ Last changed on Fri Feb 7 16:21:03 2003 by +GvR +

+ +


+

4.89. How do I modify a string in place?

+Strings are immutable (see question 6.2) so you cannot modify a string +directly. If you need an object with this ability, try converting the +string to a list or take a look at the array module. +

+

+    >>> s = "Hello, world"
+    >>> a = list(s)
+    >>> print a
+    ['H', 'e', 'l', 'l', 'o', ',', ' ', 'w', 'o', 'r', 'l', 'd']
+    >>> a[7:] = list("there!")
+    >>> import string
+    >>> print string.join(a, '')
+    Hello, there!
+
+
+    >>> import array
+    >>> a = array.array('c', s)
+    >>> print a
+    array('c', 'Hello, world')
+    >>> a[0] = 'y' ; print a
+    array('c', 'yello, world')
+    >>> a.tostring()
+    'yello, world'
+
+

+ +Edit this entry / +Log info + +/ Last changed on Tue May 18 01:22:47 1999 by +Andrew Dalke +

+ +


+

4.90. How to pass on keyword/optional parameters/arguments

+Q: How can I pass on optional or keyword parameters from one function to another? +

+

+	def f1(a, *b, **c):
+		...
+
+A: In Python 2.0 and above: +

+

+	def f2(x, *y, **z):
+		...
+		z['width']='14.3c'
+		...
+		f1(x, *y, **z)
+
+
+   Note: y can be any sequence (e.g., list or tuple) and z must be a dict.
+
+

+A: For versions prior to 2.0, use 'apply', like: +

+

+	def f2(x, *y, **z):
+		...
+		z['width']='14.3c'
+		...
+		apply(f1, (x,)+y, z)
+
+

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 07:20:56 2002 by +Matthias Urlichs +

+ +


+

4.91. How can I get a dictionary to display its keys in a consistent order?

+In general, dictionaries store their keys in an unpredictable order, +so the display order of a dictionary's elements will be similarly +unpredictable. +(See +Question 6.12 +to understand why this is so.) +

+This can be frustrating if you want to save a printable version to a +file, make some changes and then compare it with some other printed +dictionary. If you have such needs you can subclass UserDict.UserDict +to create a SortedDict class that prints itself in a predictable order. +Here's one simpleminded implementation of such a class: +

+

+  import UserDict, string
+
+
+  class SortedDict(UserDict.UserDict):
+    def __repr__(self):
+      result = []
+      append = result.append
+      keys = self.data.keys()
+      keys.sort()
+      for k in keys:
+        append("%s: %s" % (`k`, `self.data[k]`))
+      return "{%s}" % string.join(result, ", ")
+
+
+    __str__ = __repr__
+
+

+This will work for many common situations you might encounter, though
+it's far from a perfect solution. (It won't have any effect on the
+pprint module and does not transparently handle values that are or
+contain dictionaries.)
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Sep 16 17:31:06 1999 by +Skip Montanaro +

+ +


+

4.92. Is there a Python tutorial?

+Yes. See question 1.20 at +http://www.python.org/doc/FAQ.html#1.20 +

+ +Edit this entry / +Log info + +/ Last changed on Sat Dec 4 16:04:00 1999 by +TAB +

+ +


+

4.93. Deleted

+See 4.28 +

+ +Edit this entry / +Log info + +/ Last changed on Tue May 28 20:40:37 2002 by +GvR +

+ +


+

4.94. How do I get a single keypress without blocking?

+There are several solutions; some involve using curses, which is a +pretty big thing to learn. Here's a solution without curses. (see also 4.74, for Windows, see question 8.2) +

+

+  import termios, fcntl, sys, os
+  fd = sys.stdin.fileno()
+
+
+  oldterm = termios.tcgetattr(fd)
+  newattr = termios.tcgetattr(fd)
+  newattr[3] = newattr[3] & ~termios.ICANON & ~termios.ECHO
+  termios.tcsetattr(fd, termios.TCSANOW, newattr)
+
+
+  oldflags = fcntl.fcntl(fd, fcntl.F_GETFL)
+  fcntl.fcntl(fd, fcntl.F_SETFL, oldflags | os.O_NONBLOCK)
+
+
+  try:
+      while 1:
+          try:
+              c = sys.stdin.read(1)
+              print "Got character", `c`
+          except IOError: pass
+  finally:
+      termios.tcsetattr(fd, termios.TCSAFLUSH, oldterm)
+      fcntl.fcntl(fd, fcntl.F_SETFL, oldflags)
+
+

+You need the termios and the fcntl module for any of this to work, +and I've only tried it on Linux, though it should work elsewhere. +

+In this code, characters are read and printed one at a time. +

+termios.tcsetattr() turns off stdin's echoing and disables canonical
+mode. fcntl.fcntl() is used to obtain stdin's file descriptor flags
+and modify them for non-blocking mode. Since reading stdin when it is
+empty results in an IOError, this error is caught and ignored.
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Oct 24 00:39:06 2002 by +chris +

+ +


+

4.95. Is there an equivalent to Perl chomp()? (Remove trailing newline from string)

+There are two partial substitutes. If you want to remove all trailing +whitespace, use the method string.rstrip(). Otherwise, if there is only +one line in the string, use string.splitlines()[0]. +

+

+ -----------------------------------------------------------------------
+
+
+ rstrip() is too greedy, it strips all trailing white spaces.
+ splitlines() takes ControlM as line boundary.
+ Consider these strings as input:
+   "python python    \r\n"
+   "python\rpython\r\n"
+   "python python   \r\r\r\n"
+ The results from rstrip()/splitlines() are perhaps not what we want.
+
+
+ It seems re can perform this task.
+
+

+

+ #!/usr/bin/python 
+ # requires python2                                                             
+
+
+ import re, os, StringIO
+
+
+ lines=StringIO.StringIO(
+   "The Python Programming Language\r\n"
+   "The Python Programming Language \r \r \r\r\n"
+   "The\rProgramming\rLanguage\r\n"
+   "The\rProgramming\rLanguage\r\r\r\r\n"
+   "The\r\rProgramming\r\rLanguage\r\r\r\r\n"
+ )
+
+
+ ln=re.compile("(?:[\r]?\n|\r)$") # dos:\r\n, unix:\n, mac:\r, others: unknown
+ # os.linesep does not work if someone ftps(in binary mode) a dos/mac text file
+ # to your unix box
+ #ln=re.compile(os.linesep + "$")
+
+
+ while 1:
+   s=lines.readline()
+   if not s: break
+   print "1.(%s)" % `s.rstrip()`
+   print "2.(%s)" % `ln.sub( "", s, 1)`
+   print "3.(%s)" % `s.splitlines()[0]`
+   print "4.(%s)" % `s.splitlines()`
+   print
+
+
+ lines.close()
+
+

+ +Edit this entry / +Log info + +/ Last changed on Wed Aug 8 09:51:34 2001 by +Crystal +

+ +


+

4.96. Why is join() a string method when I'm really joining the elements of a (list, tuple, sequence)?

+Strings became much more like other standard types starting in release 1.6, when methods were added which give the same functionality that has always been available using the functions of the string module. These new methods have been widely accepted, but the one which appears to make (some) programmers feel uncomfortable is: +

+

+    ", ".join(['1', '2', '4', '8', '16'])
+
+which gives the result +

+

+    "1, 2, 4, 8, 16"
+
+There are two usual arguments against this usage. +

+The first runs along the lines of: "It looks really ugly using a method of a string literal (string constant)", to which the answer is that it might, but a string literal is just a fixed value. If the methods are to be allowed on names bound to strings there is no logical reason to make them unavailable on literals. Get over it! +

+The second objection is typically cast as: "I am really telling a sequence to join its members together with a string constant". Sadly, you aren't. For some reason there seems to be much less difficulty with having split() as a string method, since in that case it is easy to see that +

+

+    "1, 2, 4, 8, 16".split(", ")
+
+is an instruction to a string literal to return the substrings delimited by the given separator (or, by default, arbitrary runs of white space). In this case a Unicode string returns a list of Unicode strings, an ASCII string returns a list of ASCII strings, and everyone is happy. +

+join() is a string method because in using it you are telling the separator string to iterate over an arbitrary sequence, forming string representations of each of the elements, and inserting itself between the elements' representations. This method can be used with any argument which obeys the rules for sequence objects, including any new classes you might define yourself.
+

+Because this is a string method it can work for Unicode strings as well as plain ASCII strings. If join() were a method of the sequence types then the sequence types would have to decide which type of string to return depending on the type of the separator. +

+If none of these arguments persuade you, then for the moment you can continue to use the join() function from the string module, which allows you to write +

+

+    string.join(['1', '2', '4', '8', '16'], ", ")
+
+You will just have to try and forget that the string module actually uses the syntax you are complaining about to implement the syntax you prefer!
+

+ +Edit this entry / +Log info + +/ Last changed on Fri Aug 2 15:51:58 2002 by +Steve Holden +

+ +


+

4.97. How can my code discover the name of an object?

+Generally speaking, it can't, because objects don't really have names. The assignment statement does not store the assigned value in the name but a reference to it. Essentially, assignment creates a binding of a name to a value. The same is true of def and class statements, but in that case the value is a callable. Consider the following code: +

+

+    class A:
+        pass
+
+
+    B = A
+
+
+    a = B()
+    b = a
+    print b
+    <__main__.A instance at 016D07CC>
+    print a
+    <__main__.A instance at 016D07CC>
+
+

+Arguably the class has a name: even though it is bound to two names and invoked through the name B the created instance is still reported as an instance of class A. However, it is impossible to say whether the instance's name is a or b, since both names are bound to the same value. +

+Generally speaking it should not be necessary for your code to "know the names" of particular values. Unless you are deliberately writing introspective programs, this is usually an indication that a change of approach might be beneficial. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Mar 8 03:53:39 2001 by +Steve Holden +

+ +


+

4.98. Why are floating point calculations so inaccurate?

+The development version of the Python Tutorial now contains an Appendix with more info: +
+    http://www.python.org/doc/current/tut/node14.html
+
+People are often very surprised by results like this: +

+

+ >>> 1.2-1.0
+ 0.199999999999999996
+
+And think it is a bug in Python. It's not. It's a problem caused by +the internal representation of a floating point number. A floating point +number is stored as a fixed number of binary digits. +

+In decimal math, there are many numbers that can't be represented +with a fixed number of decimal digits, i.e. +1/3 = 0.3333333333....... +

+In the binary case, 1/2 = 0.1, 1/4 = 0.01, 1/8 = 0.001, etc. There are +a lot of numbers that can't be represented. The digits are cut off at +some point. +

+Since Python 1.6, a floating point's repr() function prints as many digits as are necessary to make eval(repr(f)) == f true for any float f. The str() function prints the more sensible number that was probably intended: +

+

+ >>> 0.2
+ 0.20000000000000001
+ >>> print 0.2
+ 0.2
+
+Again, this has nothing to do with Python, but with the way the underlying C platform handles floating point numbers, and ultimately with the inaccuracy you'll always have when writing numbers down as strings with a fixed number of digits. +

+One of the consequences of this is that it is dangerous to compare +the result of some computation to a float with == ! +Tiny inaccuracies may mean that == fails. +

+Instead try something like this: +

+

+ epsilon = 0.0000000000001 # Tiny allowed error
+ expected_result = 0.4
+
+
+ if expected_result-epsilon <= computation() <= expected_result+epsilon:
+    ...
+
+

+ +Edit this entry / +Log info + +/ Last changed on Mon Apr 1 22:18:47 2002 by +Fred Drake +

+ +


+

4.99. I tried to open Berkeley DB file, but bsddb produces bsddb.error: (22, 'Invalid argument'). Help! How can I restore my data?

+Don't panic! Your data are probably intact. The most frequent cause +for the error is that you tried to open an earlier Berkeley DB file +with a later version of the Berkeley DB library. +

+Many Linux systems now have all three versions of Berkeley DB +available. If you are migrating from version 1 to a newer version use +db_dump185 to dump a plain text version of the database. +If you are migrating from version 2 to version 3 use db2_dump to create +a plain text version of the database. In either case, use db_load to +create a new native database for the latest version installed on your +computer. If you have version 3 of Berkeley DB installed, you should +be able to use db2_load to create a native version 2 database. +

+You should probably move away from Berkeley DB version 1 files because +the hash file code contains known bugs that can corrupt your data. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Aug 29 16:04:29 2001 by +Skip Montanaro +

+ +


+

4.100. What are the "best practices" for using import in a module?

+First, the standard modules are great. Use them! The standard Python library is large and varied. Using modules can save you time and effort and will reduce the maintenance cost of your code. (Other programs are dedicated to supporting and fixing bugs in the standard Python modules. Coworkers may also be familiar with the modules that you use, reducing the amount of time it takes them to understand your code.) +

+The rest of this answer is largely a matter of personal preference, but here's what some newsgroup posters said (thanks to all who responded) +

+In general, don't use +

+ from modulename import *
+
+Doing so clutters the importer's namespace. Some avoid this idiom even with the few modules that were designed to be imported in this manner. (Modules designed in this manner include Tkinter, thread, and wxPython.) +

+Import modules at the top of a file, one module per line. Doing so makes it clear what other modules your code requires and avoids questions of whether the module name is in scope. Using one import per line makes it easy to add and delete module imports. +

+Move imports into a local scope (such as at the top of a function definition) if there are a lot of imports, and you're trying to avoid the cost (lots of initialization time) of many imports. This technique is especially helpful if many of the imports are unnecessary depending on how the program executes. You may also want to move imports into a function if the modules are only ever used in that function. Note that loading a module the first time may be expensive (because of the one time initialization of the module) but that loading a module multiple times is virtually free (a couple of dictionary lookups). Even if the module name has gone out of scope, the module is probably available in sys.modules. Thus, there isn't really anything wrong with putting no imports at the module level (if they aren't needed) and putting all of the imports at the function level. +
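
+For example, a function that is called only rarely might defer a costly import like this (the names here are purely illustrative): +

+

+    def report(data):
+        # The module is only needed here, so keep the import local; after
+        # the first call it costs no more than a dictionary lookup.
+        import pprint
+        pprint.pprint(data)
+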

+It is sometimes necessary to move imports to a function or class to avoid problems with circular imports. Gordon says: +

+ Circular imports are fine where both modules use the "import <module>"
+ form of import. They fail when the 2nd module wants to grab a name
+ out of the first ("from module import name") and the import is at
+ the top level. That's because names in the 1st are not yet available,
+ (the first module is busy importing the 2nd).  
+
+In this case, if the 2nd module is only used in one function, then the import can easily be moved into that function. By the time the import is called, the first module will have finished initializing, and the second module can do its import. +

+It may also be necessary to move imports out of the top level of code if some of the modules are platform-specific. In that case, it may not even be possible to import all of the modules at the top of the file, and importing the correct modules in the corresponding platform-specific code is a good option. +

+If only instances of a specific class use a module, then it is reasonable to import the module in the class's __init__ method and then assign the module to an instance variable so that the module is always available (via that instance variable) during the life of the object. Note that to delay an import until the class is instantiated, the import must be inside a method. Putting the import inside the class but outside of any method still causes the import to occur when the module is initialized. +
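
+A minimal sketch of that technique (the class name and the choice of shelve are only illustrative): +

+

+    class Archive:
+        def __init__(self, filename):
+            # Deferred import: shelve is loaded only when an instance is
+            # created, and the module object is kept on the instance.
+            import shelve
+            self._shelve = shelve
+            self.filename = filename
+
+
+        def open(self):
+            return self._shelve.open(self.filename)
+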

+ +Edit this entry / +Log info + +/ Last changed on Sat Aug 4 04:44:47 2001 by +TAB +

+ +


+

4.101. Is there a tool to help find bugs or perform static analysis?

+Yes. PyChecker is a static analysis tool for finding bugs +in Python source code as well as warning about code complexity +and style. +

+You can get PyChecker from: http://pychecker.sf.net. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Aug 10 15:42:11 2001 by +Neal +

+ +


+

4.102. UnicodeError: ASCII [decoding,encoding] error: ordinal not in range(128)

+This error indicates that your Python installation can handle only 7-bit ASCII strings. There are a couple of ways to fix or work around the problem. +

+If your programs must handle data in arbitrary character set encodings, the environment the application runs in will generally identify the encoding of the data it is handing you. You need to convert the input to Unicode data using that encoding. For instance, a program that handles email or web input will typically find character set encoding information in Content-Type headers. This can then be used to properly convert input data to Unicode. Assuming the string referred to by "value" is encoded as UTF-8: +

+

+    value = unicode(value, "utf-8")
+
+will return a Unicode object. If the data is not correctly encoded as UTF-8, the above call will raise a UnicodeError. +

+If you only want strings converted to Unicode which have non-ASCII data, you can try converting them first assuming an ASCII encoding, and then generate Unicode objects if that fails: +

+

+    try:
+        x = unicode(value, "ascii")
+    except UnicodeError:
+        value = unicode(value, "utf-8")
+    else:
+        # value was valid ASCII data
+        pass
+
+

+If you normally use a character set encoding other than US-ASCII and only need to handle data in that encoding, the simplest way to fix the problem may be simply to set the encoding in sitecustomize.py. The following code is just a modified version of the encoding setup code from site.py with the relevant lines uncommented. +

+

+    # Set the string encoding used by the Unicode implementation.
+    # The default is 'ascii'
+    encoding = "ascii" # <= CHANGE THIS if you wish
+
+
+    # Enable to support locale aware default string encodings.
+    import locale
+    loc = locale.getdefaultlocale()
+    if loc[1]:
+        encoding = loc[1]
+    if encoding != "ascii":
+        import sys
+        sys.setdefaultencoding(encoding)
+
+

+Also note that on Windows, there is an encoding known as "mbcs", which uses an encoding specific to your current locale. In many cases, and particularly when working with COM, this may be an appropriate default encoding to use. +
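
+For example (this works only on Windows builds of Python): +

+

+    value = unicode(value, "mbcs")   # decode using the current Windows code page
+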

+ +Edit this entry / +Log info + +/ Last changed on Sat Apr 13 04:45:41 2002 by +Skip Montanaro +

+ +


+

4.103. Using strings to call functions/methods

+There are various techniques: +

+* Use a dictionary pre-loaded with strings and functions. The primary +advantage of this technique is that the strings do not need to match the +names of the functions. This is also the primary technique used to +emulate a case construct: +

+

+    def a():
+        pass
+
+
+    def b():
+        pass
+
+
+    dispatch = {'go': a, 'stop': b}  # Note lack of parens for funcs
+
+
+    dispatch[get_input()]()  # Note trailing parens to call function
+
+* Use the built-in function getattr(): +

+

+    import foo
+    getattr(foo, 'bar')()
+
+Note that getattr() works on any object, including classes, class +instances, modules, and so on. +

+This is used in several places in the standard library, like +this: +

+

+    class Foo:
+        def do_foo(self):
+            ...
+
+
+        def do_bar(self):
+            ...
+
+
+     f = getattr(foo_instance, 'do_' + opname)
+     f()
+
+

+* Use locals() or eval() to resolve the function name: +

+
+
+    def myFunc():
+        print "hello"
+
+
+    fname = "myFunc"
+
+
+    f = locals()[fname]
+    f()
+
+
+    f = eval(fname)
+    f()
+

+Note: Using eval() can be dangerous. If you don't have absolute control +over the contents of the string, all sorts of things could happen... +

+ +Edit this entry / +Log info + +/ Last changed on Thu Mar 21 08:14:58 2002 by +Erno Kuusela +

+ +


+

4.104. How fast are exceptions?

+A try/except block is extremely efficient when no exception is raised. Actually raising and catching an exception is expensive. In older versions of Python (prior to 2.0), it +was common to code this idiom: +

+

+    try:
+        value = dict[key]
+    except KeyError:
+        dict[key] = getvalue(key)
+        value = dict[key]
+
+This idiom only made sense when you expected the dict to have the key +95% of the time or more; other times, you coded it like this: +

+

+    if dict.has_key(key):
+        value = dict[key]
+    else:
+        dict[key] = getvalue(key)
+        value = dict[key]
+
+In Python 2.0 and higher, of course, you can code this as +

+

+    value = dict.setdefault(key, getvalue(key))
+
+However, this always evaluates getvalue(key), regardless of whether it's needed. So if getvalue() is slow or has side effects you should use one of the above variants. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Dec 9 10:12:30 2002 by +Yeti +

+ +


+

4.105. Sharing global variables across modules

+The canonical way to share information across modules within a single +program is to create a special module (often called config or cfg). +Just import the config module in all modules of your application; the +module then becomes available as a global name. Because there is only +one instance of each module, any changes made to the module object get +reflected everywhere. For example: +

+config.py: +

+

+    pass
+
+mod.py: +

+

+    import config
+    config.x = 1
+
+main.py: +

+

+    import config
+    import mod
+    print config.x
+
+Note that using a module is also the basis for implementing the +Singleton design pattern, for the same reason. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Apr 23 23:07:19 2002 by +Aahz +

+ +


+

4.106. Why is cPickle so slow?

+Use the binary option. We'd like to make that the default, but it would +break backward compatibility: +

+

+    import cPickle
+
+
+    largeString = 'z' * (100 * 1024)
+    myPickle = cPickle.dumps(largeString, 1)
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Aug 22 19:54:25 2002 by +Aahz +

+ +


+

4.107. When importing module XXX, why do I get "undefined symbol: PyUnicodeUCS2_..." ?

+You are using a version of Python that uses a 4-byte representation for +Unicode characters, but the extension module you are importing (possibly +indirectly) was compiled using a Python that uses a 2-byte representation +for Unicode characters (the default). +

+If instead the name of the undefined symbol starts with PyUnicodeUCS4_, the problem is the same but the relationship is reversed: Python was built using 2-byte Unicode characters, and the extension module was compiled using a Python with 4-byte Unicode characters. +

+This can easily occur when using pre-built extension packages. RedHat Linux 7.x, in particular, provides a "python2" binary that is compiled with 4-byte Unicode. This only causes the link failure if the extension uses any of the PyUnicode_*() functions. It is also a problem if an extension uses any of the Unicode-related format specifiers for Py_BuildValue (or similar) or parameter-specifications for PyArg_ParseTuple(). +

+You can check the size of the Unicode character a Python interpreter is +using by checking the value of sys.maxunicode: +

+

+  >>> import sys
+  >>> if sys.maxunicode > 65535:
+  ...     print 'UCS4 build'
+  ... else:
+  ...     print 'UCS2 build'
+
+The only way to solve this problem is to use extension modules compiled +with a Python binary built using the same size for Unicode characters. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Aug 27 15:00:17 2002 by +Fred Drake +

+ +


+

4.108. How do I create a .pyc file?

+QUESTION: +

+I have a module and I wish to generate a .pyc file. +How do I do it? Everything I read says that generation of a .pyc file is +"automatic", but I'm not getting anywhere. +

+

+ANSWER: +

+When a module is imported for the first time (or when the source is more +recent than the current compiled file) a .pyc file containing the compiled code should be created in the +same directory as the .py file. +

+One reason that a .pyc file may not be created is permissions problems with the directory. This can happen, for example, if you develop as one user but run as another, such as if you are testing with a web server. +

+However, in most cases, that's not the problem. +

+Creation of a .pyc file is "automatic" if you are importing a module and Python has the +ability (permissions, free space, etc...) to write the compiled module +back to the directory. But note that running Python on a top level script is not considered an +import and so no .pyc will be created automatically. For example, if you have a top-level module abc.py that imports another module xyz.py, when you run abc, xyz.pyc will be created since xyz is imported, but no abc.pyc file will be created since abc isn't imported. +

+If you need to create abc.pyc -- that is, to create a .pyc file for a +module that is not imported -- you can. (Look up +the py_compile and compileall modules in the Library Reference.) +

+You can manually compile any module using the "py_compile" module. One +way is to use the compile() function in that module interactively: +

+

+    >>> import py_compile
+    >>> py_compile.compile('abc.py')
+
+This will write the .pyc to the same location as abc.py (or you +can override that with the optional parameter cfile). +

+You can also automatically compile all files in a directory or +directories using the "compileall" module, which can also be run +straight from the command line. +

+You can do it from the shell (or DOS) prompt by entering (where compile.py is a small driver script such as the one shown below): +

+       python compile.py abc.py
+
+or +
+       python compile.py *
+
+Or you can write a script to do it on a list of filenames that you enter. +

+

+     import sys
+     from py_compile import compile
+
+
+     if len(sys.argv) <= 1:
+        sys.exit(1)
+
+
+     for file in sys.argv[1:]:
+        compile(file)
+
+ACKNOWLEDGMENTS: +

+Steve Holden, David Bolen, Rich Somerfield, Oleg Broytmann, Steve Ferg +

+ +Edit this entry / +Log info + +/ Last changed on Wed Feb 12 15:58:25 2003 by +Stephen Ferg +

+ +


+

5. Extending Python

+ +
+

5.1. Can I create my own functions in C?

+Yes, you can create built-in modules containing functions, +variables, exceptions and even new types in C. This is explained in +the document "Extending and Embedding the Python Interpreter" (http://www.python.org/doc/current/ext/ext.html). Also read the chapter +on dynamic loading. +

+There's more information on this in each of the Python books: +Programming Python, Internet Programming with Python, and Das Python-Buch +(in German). +

+ +Edit this entry / +Log info + +/ Last changed on Mon Dec 10 05:18:57 2001 by +Fred L. Drake, Jr. +

+ +


+

5.2. Can I create my own functions in C++?

+Yes, using the C-compatibility features found in C++. Basically +you place extern "C" { ... } around the Python include files and put +extern "C" before each function that is going to be called by the +Python interpreter. Global or static C++ objects with constructors +are probably not a good idea. +

+ +Edit this entry / +Log info +

+ +


+

5.3. How can I execute arbitrary Python statements from C?

+The highest-level function to do this is PyRun_SimpleString() which takes +a single string argument which is executed in the context of module +__main__ and returns 0 for success and -1 when an exception occurred +(including SyntaxError). If you want more control, use PyRun_String(); +see the source for PyRun_SimpleString() in Python/pythonrun.c. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 23 20:08:14 1997 by +Bill Tutt +

+ +


+

5.4. How can I evaluate an arbitrary Python expression from C?

+Call the function PyRun_String() from the previous question with the +start symbol eval_input (Py_eval_input starting with 1.5a1); it +parses an expression, evaluates it and returns its value. +

+ +Edit this entry / +Log info + +/ Last changed on Wed May 21 22:23:18 1997 by +David Ascher +

+ +


+

5.5. How do I extract C values from a Python object?

+That depends on the object's type. If it's a tuple, PyTuple_Size(o) returns its length and PyTuple_GetItem(o, i) returns its i'th item; similar for lists with PyList_Size(o) and PyList_GetItem(o, i). For strings, PyString_Size(o) returns its length and PyString_AsString(o) a pointer to its value (note that Python strings may contain null bytes so strlen() is not safe). To test which type an object is, first make sure it isn't NULL, and then use PyString_Check(o), PyTuple_Check(o), PyList_Check(o), etc. +

+There is also a high-level API to Python objects which is +provided by the so-called 'abstract' interface -- read +Include/abstract.h for further details. It allows for example +interfacing with any kind of Python sequence (e.g. lists and tuples) +using calls like PySequence_Length(), PySequence_GetItem(), etc.) +as well as many other useful protocols. +

+ +Edit this entry / +Log info + +/ Last changed on Wed May 21 22:34:20 1997 by +David Ascher +

+ +


+

5.6. How do I use Py_BuildValue() to create a tuple of arbitrary length?

+You can't. Use t = PyTuple_New(n) instead, and fill it with +objects using PyTuple_SetItem(t, i, o) -- note that this "eats" a +reference count of o. Similar for lists with PyList_New(n) and +PyList_SetItem(l, i, o). Note that you must set all the tuple items to +some value before you pass the tuple to Python code -- +PyTuple_New(n) initializes them to NULL, which isn't a valid Python +value. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Jul 31 18:15:29 1997 by +Guido van Rossum +

+ +


+

5.7. How do I call an object's method from C?

+The PyObject_CallMethod() function can be used to call an arbitrary +method of an object. The parameters are the object, the name of the +method to call, a format string like that used with Py_BuildValue(), and the argument values: +

+

+    PyObject *
+    PyObject_CallMethod(PyObject *object, char *method_name,
+                        char *arg_format, ...);
+
+This works for any object that has methods -- whether built-in or +user-defined. You are responsible for eventually DECREF'ing the +return value. +

+To call, e.g., a file object's "seek" method with arguments 10, 0 +(assuming the file object pointer is "f"): +

+

+        res = PyObject_CallMethod(f, "seek", "(ii)", 10, 0);
+        if (res == NULL) {
+                ... an exception occurred ...
+        }
+        else {
+                Py_DECREF(res);
+        }
+
+Note that since PyObject_CallObject() always wants a tuple for the +argument list, to call a function without arguments, pass "()" for the +format, and to call a function with one argument, surround the argument +in parentheses, e.g. "(i)". +

+ +Edit this entry / +Log info + +/ Last changed on Thu Jun 6 16:15:46 2002 by +Neal Norwitz +

+ +


+

5.8. How do I catch the output from PyErr_Print() (or anything that prints to stdout/stderr)?

+(Due to Mark Hammond): +

+In Python code, define an object that supports the "write()" method. +Redirect sys.stdout and sys.stderr to this object. +Call print_error, or just allow the standard traceback mechanism to +work. Then, the output will go wherever your write() method sends it. +

+The easiest way to do this is to use the StringIO class in the standard +library. +

+Sample code and use for catching stdout: +

+	>>> class StdoutCatcher:
+	...  def __init__(self):
+	...   self.data = ''
+	...  def write(self, stuff):
+	...   self.data = self.data + stuff
+	...  
+	>>> import sys
+	>>> sys.stdout = StdoutCatcher()
+	>>> print 'foo'
+	>>> print 'hello world!'
+	>>> sys.stderr.write(sys.stdout.data)
+	foo
+	hello world!
+
+

+ +Edit this entry / +Log info + +/ Last changed on Wed Dec 16 18:34:25 1998 by +Richard Jones +

+ +


+

5.9. How do I access a module written in Python from C?

+You can get a pointer to the module object as follows: +

+

+        module = PyImport_ImportModule("<modulename>");
+
+If the module hasn't been imported yet (i.e. it is not yet present in +sys.modules), this initializes the module; otherwise it simply returns +the value of sys.modules["<modulename>"]. Note that it doesn't enter +the module into any namespace -- it only ensures it has been +initialized and is stored in sys.modules. +

+You can then access the module's attributes (i.e. any name defined in +the module) as follows: +

+

+        attr = PyObject_GetAttrString(module, "<attrname>");
+
+Calling PyObject_SetAttrString(), to assign to variables in the module, also works. +

+ +Edit this entry / +Log info + +/ Last changed on Wed May 21 22:56:40 1997 by +david ascher +

+ +


+

5.10. How do I interface to C++ objects from Python?

+Depending on your requirements, there are many approaches. To do +this manually, begin by reading the "Extending and Embedding" document +(Doc/ext.tex, see also http://www.python.org/doc/). Realize +that for the Python run-time system, there isn't a whole lot of +difference between C and C++ -- so the strategy to build a new Python +type around a C structure (pointer) type will also work for C++ +objects. +

+A useful automated approach (which also works for C) is SWIG: +http://www.swig.org/. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Oct 15 05:14:01 1999 by +Sjoerd Mullender +

+ +


+

5.11. mSQLmodule (or other old module) won't build with Python 1.5 (or later)

+Since Python 1.4, "Python.h" has contained all the includes needed by an extension module. Backward compatibility was dropped after version 1.4, and therefore mSQLmodule.c will not build because "allobjects.h" cannot be found. The following change in mSQLmodule.c is harmless when building it with 1.4 and necessary when doing so for later Python versions: +

+Remove lines: +

+

+	#include "allobjects.h"
+	#include "modsupport.h"
+
+And insert instead: +

+

+	#include "Python.h"
+
+You may also need to add +

+

+                #include "rename2.h"
+
+if the module uses "old names". +

+This may happen with other ancient python modules as well, +and the same fix applies. +

+ +Edit this entry / +Log info + +/ Last changed on Sun Dec 21 02:03:35 1997 by +GvR +

+ +


+

5.12. I added a module using the Setup file and the make fails! Huh?

+Setup must end in a newline; if there is no newline there, it gets very sad. Aside from this possibility, maybe you have other non-Python-specific linkage problems. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jun 24 15:54:01 1997 by +aaron watters +

+ +


+

5.13. I want to compile a Python module on my Red Hat Linux system, but some files are missing.

+Red Hat's RPM for Python doesn't include the +/usr/lib/python1.x/config/ directory, which contains various files required +for compiling Python extensions. +Install the python-devel RPM to get the necessary files. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 26 13:44:04 1999 by +A.M. Kuchling +

+ +


+

5.14. What does "SystemError: _PyImport_FixupExtension: module yourmodule not loaded" mean?

+This means that you have created an extension module named "yourmodule", but your module init function does not initialize with that name. +

+Every module init function will have a line similar to: +

+

+  module = Py_InitModule("yourmodule", yourmodule_functions);
+
+If the string passed to this function is not the same name as your extension module, the SystemError will be raised. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Mar 25 07:16:08 1999 by +Mark Hammond +

+ +


+

5.15. How to tell "incomplete input" from "invalid input"?

+Sometimes you want to emulate the Python interactive interpreter's +behavior, where it gives you a continuation prompt when the input +is incomplete (e.g. you typed the start of an "if" statement +or you didn't close your parentheses or triple string quotes), +but it gives you a syntax error message immediately when the input +is invalid. +

+In Python you can use the codeop module, which approximates the +parser's behavior sufficiently. IDLE uses this, for example. +

+The easiest way to do it in C is to call PyRun_InteractiveLoop() +(in a separate thread maybe) and let the Python interpreter handle +the input for you. You can also set the PyOS_ReadlineFunctionPointer +to point at your custom input function. See Modules/readline.c and +Parser/myreadline.c for more hints. +

+However, sometimes you have to run the embedded Python interpreter in the same thread as the rest of your application, and you can't allow the PyRun_InteractiveLoop() to stop while waiting for user input. One solution then is to call PyParser_ParseString() and test for e.error equal to E_EOF (then the input is incomplete). Sample code fragment, untested, inspired by code from Alex Farber: +

+

+  #include <Python.h>
+  #include <node.h>
+  #include <errcode.h>
+  #include <grammar.h>
+  #include <parsetok.h>
+  #include <compile.h>
+
+
+  int testcomplete(char *code)
+    /* code should end in \n */
+    /* return -1 for error, 0 for incomplete, 1 for complete */
+  {
+    node *n;
+    perrdetail e;
+
+
+    n = PyParser_ParseString(code, &_PyParser_Grammar,
+                             Py_file_input, &e);
+    if (n == NULL) {
+      if (e.error == E_EOF) 
+        return 0;
+      return -1;
+    }
+
+
+    PyNode_Free(n);
+    return 1;
+  }
+
+Another solution is trying to compile the received string with +Py_CompileString(). If it compiles fine - try to execute the returned +code object by calling PyEval_EvalCode(). Otherwise save the input for +later. If the compilation fails, find out if it's an error or just +more input is required - by extracting the message string from the +exception tuple and comparing it to the "unexpected EOF while parsing". +Here is a complete example using the GNU readline library (you may +want to ignore SIGINT while calling readline()): +

+

+  #include <stdio.h>
+  #include <readline.h>
+
+
+  #include <Python.h>
+  #include <object.h>
+  #include <compile.h>
+  #include <eval.h>
+
+
+  int main (int argc, char* argv[])
+  {
+    int i, j, done = 0;                          /* lengths of line, code */
+    char ps1[] = ">>> ";
+    char ps2[] = "... ";
+    char *prompt = ps1;
+    char *msg, *line, *code = NULL;
+    PyObject *src, *glb, *loc;
+    PyObject *exc, *val, *trb, *obj, *dum;
+
+
+    Py_Initialize ();
+    loc = PyDict_New ();
+    glb = PyDict_New ();
+    PyDict_SetItemString (glb, "__builtins__", PyEval_GetBuiltins ());
+
+
+    while (!done)
+    {
+      line = readline (prompt);
+
+
+      if (NULL == line)                          /* CTRL-D pressed */
+      {
+        done = 1;
+      }
+      else
+      {
+        i = strlen (line);
+
+
+        if (i > 0)
+          add_history (line);                    /* save non-empty lines */
+
+
+        if (NULL == code)                        /* nothing in code yet */
+          j = 0;
+        else
+          j = strlen (code);
+
+
+        code = realloc (code, i + j + 2);
+        if (NULL == code)                        /* out of memory */
+          exit (1);
+
+
+        if (0 == j)                              /* code was empty, so */
+          code[0] = '\0';                        /* keep strncat happy */
+
+
+        strncat (code, line, i);                 /* append line to code */
+        code[i + j] = '\n';                      /* append '\n' to code */
+        code[i + j + 1] = '\0';
+
+
+        src = Py_CompileString (code, "<stdin>", Py_single_input);       
+
+
+        if (NULL != src)                         /* compiled just fine - */
+        {
+          if (ps1  == prompt ||                  /* ">>> " or */
+              '\n' == code[i + j - 1])           /* "... " and double '\n' */
+          {                                               /* so execute it */
+            dum = PyEval_EvalCode ((PyCodeObject *)src, glb, loc);
+            Py_XDECREF (dum);
+            Py_XDECREF (src);
+            free (code);
+            code = NULL;
+            if (PyErr_Occurred ())
+              PyErr_Print ();
+            prompt = ps1;
+          }
+        }                                        /* syntax error or E_EOF? */
+        else if (PyErr_ExceptionMatches (PyExc_SyntaxError))           
+        {
+          PyErr_Fetch (&exc, &val, &trb);        /* clears exception! */
+
+
+          if (PyArg_ParseTuple (val, "sO", &msg, &obj) &&
+              !strcmp (msg, "unexpected EOF while parsing")) /* E_EOF */
+          {
+            Py_XDECREF (exc);
+            Py_XDECREF (val);
+            Py_XDECREF (trb);
+            prompt = ps2;
+          }
+          else                                   /* some other syntax error */
+          {
+            PyErr_Restore (exc, val, trb);
+            PyErr_Print ();
+            free (code);
+            code = NULL;
+            prompt = ps1;
+          }
+        }
+        else                                     /* some non-syntax error */
+        {
+          PyErr_Print ();
+          free (code);
+          code = NULL;
+          prompt = ps1;
+        }
+
+
+        free (line);
+      }
+    }
+
+
+    Py_XDECREF(glb);
+    Py_XDECREF(loc);
+    Py_Finalize();
+    exit(0);
+  }
+
+

+ +Edit this entry / +Log info + +/ Last changed on Wed Mar 15 09:47:24 2000 by +Alex Farber +

+ +


+

5.16. How do I debug an extension?

+When using gdb with dynamically loaded extensions, you can't set a +breakpoint in your extension until your extension is loaded. +

+In your .gdbinit file (or interactively), add the command +

+
+
+    br _PyImport_LoadDynamicModule
+
+A typical debugging session then looks like this: +
+
+
+
+    $ gdb /local/bin/python
+    (gdb) run myscript.py
+    (gdb) continue     # repeat until your extension is loaded
+    (gdb) finish       # so that your extension is loaded
+    (gdb) br myfunction.c:50
+    (gdb) continue
+

+ +Edit this entry / +Log info + +/ Last changed on Fri Oct 20 11:10:32 2000 by +Joe VanAndel +

+ +


+

5.17. How do I find undefined Linux g++ symbols, __builtin_new or __pure_virtual

+To dynamically load g++ extension modules, you must recompile python, relink python using g++ (change LINKCC in the python Modules Makefile), and link your extension module using g++ (e.g., "g++ -shared -o mymodule.so mymodule.o"). +

+ +Edit this entry / +Log info + +/ Last changed on Sun Jan 14 18:03:51 2001 by +douglas orr +

+ +


+

5.18. How do I define and create objects corresponding to built-in/extension types

+Usually you would like to be able to inherit from a Python type when +you ask this question. The bottom line for Python 2.2 is: types and classes are miscible. You build instances by calling classes, and you can build subclasses to your heart's desire. +

+You need to be careful when instantiating immutable types like integers or strings. See http://www.amk.ca/python/2.2/, section 2, for details. +
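
+For example, here is a minimal sketch of subclassing an immutable built-in type under 2.2; the class is invented for illustration, and the key point is that the value must be handled in __new__ rather than __init__: +

+

+    class Inches(int):
+        def __new__(cls, value):
+            # int is immutable, so the value has to be supplied at creation time.
+            return int.__new__(cls, value)
+
+
+        def __str__(self):
+            return '%d in' % int(self)
+
+
+    print Inches(12)        # prints: 12 in
+    print Inches(12) + 30   # behaves like a plain int: prints 42
+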

+Prior to version 2.2, Python (like Java) insisted that there are first-class and second-class objects (the former are types, the latter classes), and never the twain shall meet. +

+The library has, however, done a good job of providing class wrappers for the more commonly desired objects (see UserDict, UserList and UserString for examples), and more are always welcome if you happen to be in the mood to write code. These wrappers still exist in Python 2.2. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 10 15:14:07 2002 by +Matthias Urlichs +

+ +


+

6. Python's design

+ +
+

6.1. Why isn't there a switch or case statement in Python?

+You can do this easily enough with a sequence of +if... elif... elif... else. There have been some proposals for switch +statement syntax, but there is no consensus (yet) on whether and how +to do range tests. +
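
+For example (the command names are made up), an if/elif chain reads naturally as a switch; for larger tables a dictionary of functions works well (see question 4.103): +

+

+    if command == "start":
+        start_engine()
+    elif command == "stop":
+        stop_engine()
+    else:
+        raise ValueError, "unknown command: %r" % (command,)
+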

+ +Edit this entry / +Log info +

+ +


+

6.2. Why does Python use indentation for grouping of statements?

+Basically I believe that using indentation for grouping is +extremely elegant and contributes a lot to the clarity of the average +Python program. Most people learn to love this feature after a while. +Some arguments for it: +

+Since there are no begin/end brackets there cannot be a disagreement +between grouping perceived by the parser and the human reader. I +remember long ago seeing a C fragment like this: +

+

+        if (x <= y)
+                x++;
+                y--;
+        z++;
+
+and staring a long time at it wondering why y was being decremented +even for x > y... (And I wasn't a C newbie then either.) +

+Since there are no begin/end brackets, Python is much less prone to coding-style conflicts. In C there are loads of different ways to place the braces (including the choice whether to place braces around single statements in certain cases, for consistency). If you're used to reading (and writing) code that uses one style, you will feel at least slightly uneasy when reading (or being required to write) another style. +

+Many coding styles place begin/end brackets on a line by themselves. This makes programs considerably longer and wastes valuable screen space, making it harder to get a good overview of a program. Ideally, a function should fit on one basic tty screen (say, 20 lines). 20 lines of Python are worth a LOT more than 20 lines of C. This is not solely due to the lack of begin/end brackets (the lack of declarations also helps, and the powerful operations of course), but it certainly helps! +

+ +Edit this entry / +Log info + +/ Last changed on Wed May 21 16:00:15 1997 by +GvR +

+ +


+

6.3. Why are Python strings immutable?

+There are two advantages. One is performance: knowing that a +string is immutable makes it easy to lay it out at construction time +-- fixed and unchanging storage requirements. (This is also one of +the reasons for the distinction between tuples and lists.) The +other is that strings in Python are considered as "elemental" as +numbers. No amount of activity will change the value 8 to anything +else, and in Python, no amount of activity will change the string +"eight" to anything else. (Adapted from Jim Roskind) +

+ +Edit this entry / +Log info +

+ +


+

6.4. Delete

+

+

+ +Edit this entry / +Log info + +/ Last changed on Tue Jan 2 03:05:25 2001 by +Moshe Zadka +

+ +


+

6.5. Why does Python use methods for some functionality (e.g. list.index()) but functions for other (e.g. len(list))?

+The major reason is history. Functions were used for those +operations that were generic for a group of types and which +were intended to work even for objects that didn't have +methods at all (e.g. numbers before type/class unification +began, or tuples). +

+It is also convenient to have a function that can readily be applied +to an amorphous collection of objects when you use the functional features of Python (map(), apply() et al). +

+In fact, implementing len(), max(), min() as a built-in function is +actually less code than implementing them as methods for each type. +One can quibble about individual cases but it's a part of Python, +and it's too late to change such things fundamentally now. The +functions have to remain to avoid massive code breakage. +

+Note that for string operations Python has moved from external functions +(the string module) to methods. However, len() is still a function. +

+ +Edit this entry / +Log info + +/ Last changed on Thu May 30 14:08:58 2002 by +Steve Holden +

+ +


+

6.6. Why can't I derive a class from built-in types (e.g. lists or files)?

+As of Python 2.2, you can derive from built-in types. For previous versions, the answer is: +

+This is caused by the relatively late addition of (user-defined) +classes to the language -- the implementation framework doesn't easily +allow it. See the answer to question 4.2 for a work-around. This +may be fixed in the (distant) future. +

+ +Edit this entry / +Log info + +/ Last changed on Thu May 23 02:53:22 2002 by +Neal Norwitz +

+ +


+

6.7. Why must 'self' be declared and used explicitly in method definitions and calls?

+So, is your current programming language C++ or Java? :-) +When classes were added to Python, this was (again) the simplest way of +implementing methods without too many changes to the interpreter. The +idea was borrowed from Modula-3. It turns out to be very useful, for +a variety of reasons. +

+First, it makes it more obvious that you are using a method or +instance attribute instead of a local variable. Reading "self.x" or +"self.meth()" makes it absolutely clear that an instance variable or +method is used even if you don't know the class definition by heart. +In C++, you can sort of tell by the lack of a local variable +declaration (assuming globals are rare or easily recognizable) -- but +in Python, there are no local variable declarations, so you'd have to +look up the class definition to be sure. +

+Second, it means that no special syntax is necessary if you want to +explicitly reference or call the method from a particular class. In +C++, if you want to use a method from base class that is overridden in +a derived class, you have to use the :: operator -- in Python you can +write baseclass.methodname(self, <argument list>). This is +particularly useful for __init__() methods, and in general in cases +where a derived class method wants to extend the base class method of +the same name and thus has to call the base class method somehow. +
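
+For example: +

+

+    class Base:
+        def __init__(self, name):
+            self.name = name
+
+
+    class Derived(Base):
+        def __init__(self, name, value):
+            Base.__init__(self, name)   # explicitly extend the base class method
+            self.value = value
+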

+Lastly, for instance variables, it solves a syntactic problem with assignment: since local variables in Python are (by definition!) those variables to which a value is assigned in a function body (and that aren't explicitly declared global), there has to be some way to tell the interpreter that an assignment was meant to assign to an instance variable instead of to a local variable, and it should preferably be syntactic (for efficiency reasons). C++ does this through declarations, but Python doesn't have declarations and it would be a pity having to introduce them just for this purpose. Using the explicit "self.var" solves this nicely. Similarly, for using instance variables, having to write "self.var" means that references to unqualified names inside a method don't have to search the instance's dictionaries. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Jan 12 08:01:50 2001 by +Steve Holden +

+ +


+

6.8. Can't you emulate threads in the interpreter instead of relying on an OS-specific thread implementation?

+Answer 1: Unfortunately, the interpreter pushes at least one C stack +frame for each Python stack frame. Also, extensions can call back into +Python at almost random moments. Therefore a complete threads +implementation requires thread support for C. +

+Answer 2: Fortunately, there is Stackless Python, which has a completely redesigned interpreter loop that avoids the C stack. It's still experimental but looks very promising. Although it is binary compatible with standard Python, it's still unclear whether Stackless will make it into the core -- maybe it's just too revolutionary. Stackless Python currently lives here: http://www.stackless.com. A microthread implementation that uses it can be found here: http://world.std.com/~wware/uthread.html. +

+ +Edit this entry / +Log info + +/ Last changed on Sat Apr 15 08:18:16 2000 by +Just van Rossum +

+ +


+

6.9. Why can't lambda forms contain statements?

+Python lambda forms cannot contain statements because Python's +syntactic framework can't handle statements nested inside expressions. +

+However, in Python, this is not a serious problem. Unlike lambda +forms in other languages, where they add functionality, Python lambdas +are only a shorthand notation if you're too lazy to define a function. +

+Functions are already first class objects in Python, and can be +declared in a local scope. Therefore the only advantage of using a +lambda form instead of a locally-defined function is that you don't need to invent a name for the function -- but that's just a local variable to which the function object (which is exactly the same type of object that a lambda form yields) is assigned! +
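
+For example, the following two spellings bind exactly the same kind of function object to the same name: +

+

+    add = lambda x, y: x + y
+
+
+    def add(x, y):
+        return x + y
+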

+ +Edit this entry / +Log info + +/ Last changed on Sun Jun 14 14:15:17 1998 by +Tim Peters +

+ +


+

6.10. [deleted]

+[lambda vs non-nested scopes used to be here] +

+ +Edit this entry / +Log info + +/ Last changed on Thu Mar 21 05:20:56 2002 by +Erno Kuusela +

+ +


+

6.11. [deleted]

+[recursive functions vs non-nested scopes used to be here] +

+ +Edit this entry / +Log info + +/ Last changed on Thu Mar 21 05:22:04 2002 by +Erno Kuusela +

+ +


+

6.12. Why is there no more efficient way of iterating over a dictionary than first constructing the list of keys()?

+As of Python 2.2, you can now iterate over a dictionary directly, +using the new implied dictionary iterator: +

+

+    for k in d: ...
+
+There are also methods returning iterators over the values and items: +

+

+    for k in d.iterkeys(): # same as above
+    for v in d.itervalues(): # iterate over values
+    for k, v in d.iteritems(): # iterate over items
+
+All these require that you do not modify the dictionary during the loop. +

+For previous Python versions, the following defense should do: +

+Have you tried it? I bet it's fast enough for your purposes! In +most cases such a list takes only a few percent of the space occupied +by the dictionary. Apart from the fixed header, +the list needs only 4 bytes (the size of a pointer) per +key. A dictionary uses 12 bytes per key plus between 30 and 70 +percent hash table overhead, plus the space for the keys and values. +By necessity, all keys are distinct objects, and a string object (the most +common key type) costs at least 20 bytes plus the length of the +string. Add to that the values contained in the dictionary, and you +see that 4 bytes more per item really isn't that much more memory... +

+A call to dict.keys() makes one fast scan over the dictionary (internally, the iteration function does exist) copying the pointers to the key objects into a pre-allocated list object of the right size. The iteration time isn't lost (since you'll have to iterate anyway), unless in the majority of cases your loop terminates very prematurely (which I doubt, since you're getting the keys in random order). +

+I don't expose the dictionary iteration operation to Python +programmers because the dictionary shouldn't be modified during the +entire iteration -- if it is, there's a small chance that the +dictionary is reorganized because the hash table becomes too full, and +then the iteration may miss some items and see others twice. Exactly +because this only occurs rarely, it would lead to hidden bugs in +programs: it's easy never to have it happen during test runs if you +only insert or delete a few items per iteration -- but your users will +surely hit upon it sooner or later. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 24 21:24:08 2002 by +GvR +

+ +


+

6.13. Can Python be compiled to machine code, C or some other language?

+Not easily. Python's high level data types, dynamic typing of +objects and run-time invocation of the interpreter (using eval() or +exec) together mean that a "compiled" Python program would probably +consist mostly of calls into the Python run-time system, even for +seemingly simple operations like "x+1". +

+Several projects described in the Python newsgroup or at past +Python conferences have shown that this approach is feasible, +although the speedups reached so far are only modest (e.g. 2x). +JPython uses the same strategy for compiling to Java bytecode. +(Jim Hugunin has demonstrated that in combination with whole-program +analysis, speedups of 1000x are feasible for small demo programs. +See the website for the 1997 Python conference.) +

+Internally, Python source code is always translated into a "virtual +machine code" or "byte code" representation before it is interpreted +(by the "Python virtual machine" or "bytecode interpreter"). In order +to avoid the overhead of parsing and translating modules that rarely +change over and over again, this byte code is written on a file whose +name ends in ".pyc" whenever a module is parsed (from a file whose +name ends in ".py"). When the corresponding .py file is changed, it +is parsed and translated again and the .pyc file is rewritten. +

+There is no performance difference once the .pyc file has been loaded +(the bytecode read from the .pyc file is exactly the same as the bytecode +created by direct translation). The only difference is that loading +code from a .pyc file is faster than parsing and translating a .py +file, so the presence of precompiled .pyc files will generally improve +start-up time of Python scripts. If desired, the Lib/compileall.py +module/script can be used to force creation of valid .pyc files for a +given set of modules. +

+Note that the main script executed by Python, even if its filename +ends in .py, is not compiled to a .pyc file. It is compiled to +bytecode, but the bytecode is not saved to a file. +

+If you are looking for a way to translate Python programs in order to +distribute them in binary form, without the need to distribute the +interpreter and library as well, have a look at the freeze.py script +in the Tools/freeze directory. This creates a single binary file +incorporating your program, the Python interpreter, and those parts of +the Python library that are needed by your program. Of course, the +resulting binary will only run on the same type of platform as that +used to create it. +

+Newsflash: there are now several programs that do this, to some extent. +Look for Psyco, Pyrex, PyInline, Py2Cmod, and Weave. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 24 21:26:19 2002 by +GvR +

+ +


+

6.14. How does Python manage memory?

+The details of Python memory management depend on the implementation. +The standard Python implementation (the C implementation) uses reference +counting and another mechanism to collect reference cycles. +

+Jython relies on the Java runtime; so it uses +the JVM's garbage collector. This difference can cause some subtle +porting problems if your Python code depends on the behavior of +the reference counting implementation. +

+The reference cycle collector was added in CPython 2.0. It periodically executes a cycle detection algorithm which looks for inaccessible cycles and deletes the objects involved. A new gc module provides functions to perform a garbage collection, obtain debugging statistics, and tune the collector's parameters. +
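
+A small illustration of the gc module (the threshold value is arbitrary): +

+

+    import gc
+
+
+    gc.collect()                  # force a collection pass right now
+    print gc.get_threshold()      # the collector's current tuning parameters
+    gc.set_threshold(1000)        # raise the first threshold so collection runs less often
+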

+The detection of cycles can be disabled when Python is compiled, if you can't afford even a tiny speed penalty or suspect that the cycle collection is buggy, by specifying the "--without-cycle-gc" switch when running the configure script. +

+Sometimes objects get stuck in "tracebacks" temporarily and hence are not deallocated when you might expect. Clear the tracebacks via +

+

+       import sys
+       sys.exc_traceback = sys.last_traceback = None
+
+Tracebacks are used for reporting errors and implementing debuggers and related things. They contain a portion of the program state extracted during the handling of an exception (usually the most recent exception). +

+In the absence of circularities and modulo tracebacks, Python programs need not explicitly manage memory. +

+Why doesn't Python use a more traditional garbage collection scheme? For one thing, unless this were added to C as a standard feature, it's a portability pain in the ass. And yes, I know about the Xerox library. It has bits of assembler code for most common platforms. Not for all. And although it is mostly transparent, it isn't completely transparent (when I once linked Python with it, it dumped core). +

+Traditional GC also becomes a problem when Python gets embedded into +other applications. While in a stand-alone Python it may be fine to +replace the standard malloc() and free() with versions provided by the +GC library, an application embedding Python may want to have its own +substitute for malloc() and free(), and may not want Python's. Right +now, Python works with anything that implements malloc() and free() +properly. +

+In Jython, the following code (which is +fine in C Python) will probably run out of file descriptors long before +it runs out of memory: +

+

+        for file in <very long list of files>:
+                f = open(file)
+                c = f.read(1)
+
+Using the current reference counting and destructor scheme, each new +assignment to f closes the previous file. Using GC, this is not +guaranteed. Sure, you can think of ways to fix this. But it's not +off-the-shelf technology. If you want to write code that will +work with any Python implementation, you should explicitly close +the file; this will work regardless of GC: +

+

+       for file in <very long list of files>:
+                f = open(file)
+                c = f.read(1)
+                f.close()
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Mar 21 05:35:38 2002 by +Erno Kuusela +

+ +


+

6.15. Why are there separate tuple and list data types?

+This is done so that tuples can be immutable while lists are mutable. +

+Immutable tuples are useful in situations where you need to pass a few +items to a function and don't want the function to modify the tuple; +for example, +

+

+	point1 = (120, 140)
+	point2 = (200, 300)
+	record(point1, point2)
+	draw(point1, point2)
+
+You don't want to have to think about what would happen if record() +changed the coordinates -- it can't, because the tuples are immutable. +

+On the other hand, when creating large lists dynamically, it is +absolutely crucial that they are mutable -- adding elements to a tuple +one by one requires using the concatenation operator, which makes it +quadratic in time. +

+As a general guideline, use tuples like you would use structs in C or +records in Pascal, use lists like (variable length) arrays. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 23 15:26:03 1997 by +GvR +

+ +


+

6.16. How are lists implemented?

+Despite what a Lisper might think, Python's lists are really +variable-length arrays. The implementation uses a contiguous +array of references to other objects, and keeps a pointer +to this array (as well as its length) in a list head structure. +

+This makes indexing a list (a[i]) an operation whose cost is +independent of the size of the list or the value of the index. +

+When items are appended or inserted, the array of references is resized. +Some cleverness is applied to improve the performance of appending +items repeatedly; when the array must be grown, some extra space +is allocated so the next few times don't require an actual resize. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 23 15:32:24 1997 by +GvR +

+ +


+

6.17. How are dictionaries implemented?

+Python's dictionaries are implemented as resizable hash tables. +

+Compared to B-trees, this gives better performance for lookup +(the most common operation by far) under most circumstances, +and the implementation is simpler. +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 23 23:51:14 1997 by +Vladimir Marangozov +

+ +


+

6.18. Why must dictionary keys be immutable?

+The hash table implementation of dictionaries uses a hash value calculated from the key value to find the key. If the key were a mutable object, its value could change, and thus its hash could change. But since whoever changes the key object can't tell that it is being used as a dictionary key, the entry cannot be moved around in the dictionary. Then, when you try to look up the same object in the dictionary, it won't be found, since its hash value is different; and if you try to look up the old value, it won't be found either, since the value of the object found in that hash bin differs. +

+If you think you need to have a dictionary indexed with a list, +try to use a tuple instead. The function tuple(l) creates a tuple +with the same entries as the list l. +
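
+For example: +

+

+    key = tuple([1, 2])     # the immutable counterpart of the list [1, 2]
+    d = {key: '12'}
+    print d[(1, 2)]         # prints: 12
+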

+Some unacceptable solutions that have been proposed: +

+- Hash lists by their address (object ID). This doesn't work because +if you construct a new list with the same value it won't be found; +e.g., +

+

+  d = {[1,2]: '12'}
+  print d[[1,2]]
+
+will raise a KeyError exception because the id of the [1,2] used +in the second line differs from that in the first line. +In other words, dictionary keys should be compared using '==', not using 'is'. +

+- Make a copy when using a list as a key. This doesn't work because +the list (being a mutable object) could contain a reference to itself, +and then the copying code would run into an infinite loop. +

+- Allow lists as keys but tell the user not to modify them. This would +allow a class of hard-to-track bugs in programs that I'd rather not see; +it invalidates an important invariant of dictionaries (every value in +d.keys() is usable as a key of the dictionary). +

+- Mark lists as read-only once they are used as a dictionary key. +The problem is that it's not just the top-level object that could change +its value; you could use a tuple containing a list as a key. Entering +anything as a key into a dictionary would require marking all objects +reachable from there as read-only -- and again, self-referential objects +could cause an infinite loop again (and again and again). +

+There is a trick to get around this if you need to, but +use it at your own risk: You +can wrap a mutable structure inside a class instance which +has both a __cmp__ and a __hash__ method. +

+

+   class listwrapper:
+        def __init__(self, the_list):
+              self.the_list = the_list
+        def __cmp__(self, other):
+              return self.the_list == other.the_list
+        def __hash__(self):
+              l = self.the_list
+              result = 98767 - len(l)*555
+              for i in range(len(l)):
+                   try:
+                        result = result + (hash(l[i]) % 9999999) * 1001 + i
+                   except:
+                        result = (result % 7777777) + i * 333
+              return result
+
+Note that the hash computation is complicated by the +possibility that some members of the list may be unhashable +and also by the possibility of arithmetic overflow. +

+You must make sure that the hash value for all such wrapper objects that reside in a dictionary (or other hash based structure) remains fixed while the object is in the dictionary (or other structure). +

+Furthermore, it must always be the case that if +o1 == o2 (i.e. o1.__cmp__(o2)==0) then hash(o1)==hash(o2) +(i.e. o1.__hash__() == o2.__hash__()), regardless of whether +the object is in a dictionary or not. +If you fail to meet these restrictions, dictionaries and other +hash-based structures may misbehave! +

+In the case of listwrapper above whenever the wrapper +object is in a dictionary the wrapped list must not change +to avoid anomalies. Don't do this unless you are prepared +to think hard about the requirements and the consequences +of not meeting them correctly. You've been warned! +

+ +Edit this entry / +Log info + +/ Last changed on Thu Jul 10 10:08:40 1997 by +aaron watters +

+ +


+

6.19. How the heck do you make an array in Python?

+["this", 1, "is", "an", "array"] +

+Lists are arrays in the C or Pascal sense of the word (see question +6.16). The array module also provides methods for creating arrays +of fixed types with compact representations (but they are slower to +index than lists). Also note that the Numeric extension and others +define array-like structures with various characteristics as well. +
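
+A small illustrative sketch of the array module (not part of the original answer): +

+

+    import array
+    a = array.array('i', [1, 2, 3])   # compact array of C ints
+    a.append(4)
+    print a[0], len(a)                # indexing and len() work as for lists
+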

+To get Lisp-like lists, emulate cons cells +

+

+    lisp_list = ("like",  ("this",  ("example", None) ) )
+
+using tuples (or lists, if you want mutability). Here the analogue +of lisp car is lisp_list[0] and the analogue of cdr is lisp_list[1]. +Only do this if you're sure you really need to (it's usually a lot +slower than using Python lists). +
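
+As an illustrative sketch, walking such a structure might look like this: +

+

+    lisp_list = ("like", ("this", ("example", None)))
+    node = lisp_list
+    while node is not None:
+        print node[0]                 # the "car"
+        node = node[1]                # the "cdr"
+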

+Think of Python lists as mutable heterogeneous arrays of +Python objects (say that 10 times fast :) ). +

+ +Edit this entry / +Log info + +/ Last changed on Wed Aug 13 07:08:27 1997 by +aaron watters +

+ +


+

6.20. Why doesn't list.sort() return the sorted list?

+In situations where performance matters, making a copy of the list +just to sort it would be wasteful. Therefore, list.sort() sorts +the list in place. In order to remind you of that fact, it does +not return the sorted list. This way, you won't be fooled into +accidentally overwriting a list when you need a sorted copy but also +need to keep the unsorted version around. +
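
+A quick sketch of the mistake this prevents (the names are invented): +

+

+    words = ["pear", "apple"]
+    result = words.sort()
+    print result                      # None -- sort() does not return the list
+    print words                       # ['apple', 'pear'] -- sorted in place
+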

+As a result, here's the idiom to iterate over the keys of a dictionary +in sorted order: +

+

+	keys = dict.keys()
+	keys.sort()
+	for key in keys:
+		...do whatever with dict[key]...
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Dec 2 17:01:52 1999 by +Fred L. Drake, Jr. +

+ +


+

6.21. How do you specify and enforce an interface spec in Python?

+An interface specification for a module as provided +by languages such as C++ and Java describes the prototypes +for the methods and functions of the module. Many feel +that compile-time enforcement of interface specifications +helps in the construction of large programs. Python +does not support interface specifications directly, but many +of their advantages can be obtained by an appropriate +test discipline for components, which can often be very +easily accomplished in Python. There is also a tool, PyChecker, +which can be used to find problems due to subclassing. +

+A good test suite for a module can at +once provide a regression test and serve as a module interface +specification (even better, since it also gives example usage). Look at +many of the standard library modules, which often have a "script +interpretation" that provides a simple "self test." Even +modules which use complex external interfaces can often +be tested in isolation using trivial "stub" emulations of the +external interface. +
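
+A minimal sketch of such a self test (the module name mymath and its functions are invented for illustration): +

+

+    # mymath.py -- hypothetical module with a self test at the bottom
+    def double(x):
+        return 2 * x
+
+    def _test():
+        assert double(2) == 4
+        assert double("ab") == "abab"
+        print "mymath: all tests passed"
+
+    if __name__ == "__main__":        # the "script interpretation" runs the tests
+        _test()
+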

+An appropriate testing discipline (if enforced) can help +build large complex applications in Python as well as having interface +specifications would (or better). Of course Python allows you +to get sloppy and not do it. Also, you might want to design +your code with an eye to making it easily testable. +

+ +Edit this entry / +Log info + +/ Last changed on Thu May 23 03:05:29 2002 by +Neal Norwitz +

+ +


+

6.22. Why do all classes have the same type? Why do instances all have the same type?

+The Pythonic use of the word "type" is quite different from +common usage in much of the rest of the programming language +world. A "type" in Python is a description for an object's operations +as implemented in C. All classes have the same operations +implemented in C which sometimes "call back" to differing program +fragments implemented in Python, and hence all classes have the +same type. Similarly at the C level all class instances have the +same C implementation, and hence all instances have the same +type. +

+Remember that in Python usage "type" refers to a C implementation +of an object. To distinguish among instances of different classes, +use instance.__class__; see also question 4.47. Sorry for the +terminological confusion, but at this point in Python's development +nothing can be done! +
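
+A short illustrative sketch, using classic classes as in the Python versions this entry describes: +

+

+    class A: pass
+    class B: pass
+
+    a, b = A(), B()
+    print type(a) == type(b)          # prints 1 (true): both are <type 'instance'>
+    print a.__class__ is A            # prints 1 (true): __class__ tells them apart
+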

+ +Edit this entry / +Log info + +/ Last changed on Tue Jul 1 12:35:47 1997 by +aaron watters +

+ +


+

6.23. Why isn't all memory freed when Python exits?

+Objects referenced from Python module global name spaces are +not always deallocated when Python exits. +

+This may happen if there are circular references (see question +4.17). There are also certain bits of memory that are allocated +by the C library that are impossible to free (e.g. a tool +like Purify will complain about these). +

+But in general, Python 1.5 and beyond +(in contrast with earlier versions) is quite aggressive about +cleaning up memory on exit. +

+If you want to force Python to delete certain things on exit, +use the sys.exitfunc hook to force those deletions. For example, +if you are debugging an extension module using a memory analysis +tool and you wish to make Python deallocate almost everything, +you might use an exitfunc like this one: +

+

+  import sys
+
+
+  def my_exitfunc():
+       print "cleaning up"
+       import sys
+       # do order dependent deletions here
+       ...
+       # now delete everything else in arbitrary order
+       for x in sys.modules.values():
+            d = x.__dict__
+            for name in d.keys():
+                 del d[name]
+
+
+  sys.exitfunc = my_exitfunc
+
+Other exitfuncs can be less drastic, of course. +

+(In fact, this one just does what Python now already does itself; +but the example of using sys.exitfunc to force cleanups is still +useful.) +

+ +Edit this entry / +Log info + +/ Last changed on Tue Sep 29 09:46:26 1998 by +GvR +

+ +


+

6.24. Why no class methods or mutable class variables?

+The notation +

+

+    instance.attribute(arg1, arg2)
+
+usually translates to the equivalent of +

+

+    Class.attribute(instance, arg1, arg2)
+
+where Class is a (super)class of instance. Similarly +

+

+    instance.attribute = value
+
+sets an attribute of an instance (overriding any attribute of a class +that instance inherits). +

+Sometimes programmers want to have +different behaviours -- they want a method which does not bind +to the instance and a class attribute which changes in place. +Python does not preclude these behaviours, but you have to +adopt a convention to implement them. One way to accomplish +this is to use "list wrappers" and global functions. +

+

+   def C_hello():
+         print "hello"
+
+
+   class C:
+        hello = [C_hello]
+        counter = [0]
+
+
+   I = C()
+
+Here I.hello[0]() acts very much like a "class method" and +I.counter[0] = 2 alters C.counter (and doesn't override it). +If you don't understand why you'd ever want to do this, that's +because you are pure of mind, and you probably never will +want to do it! This is dangerous trickery, not recommended +when avoidable. (Inspired by Tim Peters's discussion.) +

+In Python 2.2, you can do this using the new built-in operations +classmethod and staticmethod. +See http://www.python.org/2.2/descrintro.html#staticmethods +
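
+For example, a sketch using the pre-decorator spelling available in 2.2 (the class and names are invented): +

+

+    class C(object):
+        counter = 0
+        def bump(cls):
+            cls.counter = cls.counter + 1
+        bump = classmethod(bump)
+        def hello():
+            print "hello"
+        hello = staticmethod(hello)
+
+    C.bump()                          # updates the shared C.counter in place
+    C().hello()                       # callable from the class or an instance
+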

+ +Edit this entry / +Log info + +/ Last changed on Tue Sep 11 15:59:37 2001 by +GvR +

+ +


+

6.25. Why are default values sometimes shared between objects?

+It is often expected that a function CALL creates new objects for default +values. This is not what happens. Default values are created when the +function is DEFINED; that is, there is only one such object, which every +call to the function refers to. If that object is changed, subsequent calls to the +function will see the changed object. By definition, immutable objects +(like numbers, strings, tuples, None) are safe from change. Changes to mutable +objects (like dictionaries, lists, class instances) are what cause the +confusion. +
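
+A short sketch demonstrating the effect (the function name is invented): +

+

+    def remember(item, seen=[]):      # one list, created when the def executes
+        seen.append(item)
+        return seen
+
+    print remember(1)                 # [1]
+    print remember(2)                 # [1, 2] -- the same list as before!
+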

+Because of this feature, it is good programming practice not to use mutable +objects as default values, but to create them inside the function. +Don't write: +

+

+	def foo(dict={}):  # XXX shared reference to one dict for all calls
+	    ...
+
+but: +
+	def foo(dict=None):
+		if dict is None:
+			dict = {} # create a new dict for local namespace
+
+See page 182 of "Internet Programming with Python" for one discussion +of this feature. Or see the top of page 144 or bottom of page 277 in +"Programming Python" for another discussion. +

+ +Edit this entry / +Log info + +/ Last changed on Sat Aug 16 07:03:35 1997 by +Case Roole +

+ +


+

6.26. Why no goto?

+Actually, you can use exceptions to provide a "structured goto" +that even works across function calls. Many feel that exceptions +can conveniently emulate all reasonable uses of the "go" or "goto" +constructs of C, Fortran, and other languages. For example: +

+

+   class label: pass # declare a label
+   try:
+        ...
+        if (condition): raise label() # goto label
+        ...
+   except label: # where to goto
+        pass
+   ...
+
+This doesn't allow you to jump into the middle of a loop, but +that's usually considered an abuse of goto anyway. Use sparingly. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Sep 10 07:16:44 1997 by +aaron watters +

+ +


+

6.27. How do you make a higher order function in Python?

+You have two choices: you can use default arguments and override +them or you can use "callable objects." For example suppose you +wanted to define linear(a,b) which returns a function f where f(x) +computes the value a*x+b. Using default arguments: +

+

+     def linear(a,b):
+         def result(x, a=a, b=b):
+             return a*x + b
+         return result
+
+Or using callable objects: +

+

+     class linear:
+        def __init__(self, a, b):
+            self.a, self.b = a,b
+        def __call__(self, x):
+            return self.a * x + self.b
+
+In both cases: +

+

+     taxes = linear(0.3,2)
+
+gives a callable object where taxes(10e6) == 0.3 * 10e6 + 2. +

+The defaults strategy has the disadvantage that the default arguments +could be accidentally or maliciously overridden. The callable objects +approach has the disadvantage that it is a bit slower and a bit +longer. Note however that a collection of callables can share +their signature via inheritance. For example: +

+

+      class exponential(linear):
+         # __init__ inherited
+         def __call__(self, x):
+             return self.a * (x ** self.b)
+
+On comp.lang.python, zenin@bawdycaste.org points out that +an object can encapsulate state for several methods in order +to emulate the "closure" concept from functional programming +languages, for example: +

+

+    class counter:
+        value = 0
+        def set(self, x): self.value = x
+        def up(self): self.value=self.value+1
+        def down(self): self.value=self.value-1
+
+
+    count = counter()
+    inc, dec, reset = count.up, count.down, count.set
+
+Here inc, dec and reset act like "functions which share the +same closure containing the variable count.value" (if you +like that way of thinking). +

+ +Edit this entry / +Log info + +/ Last changed on Fri Sep 25 08:38:35 1998 by +Aaron Watters +

+ +


+

6.28. Why do I get a SyntaxError for a 'continue' inside a 'try'?

+This is an implementation limitation, +caused by the extremely simple-minded +way Python generates bytecode. The try block pushes something on the +"block stack" which the continue would have to pop off again. The +current code generator doesn't have the data structures around so that +'continue' can generate the right code. +
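
+Until then, one possible workaround (a sketch, not part of the original answer) is to record the decision in a flag and skip the rest of the try block: +

+

+    lines = ["one", "", "two"]
+    for line in lines:
+        skip = 0
+        try:
+            if not line:
+                skip = 1              # where you wanted 'continue'
+            if not skip:
+                print "processing", line
+        except ValueError:
+            pass
+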

+Note that JPython doesn't have this restriction! +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 22 15:01:07 1998 by +GvR +

+ +


+

6.29. Why can't raw strings (r-strings) end with a backslash?

+More precisely, they can't end with an odd number of backslashes: +the unpaired backslash at the end escapes the closing quote character, +leaving an unterminated string. +

+Raw strings were designed to ease creating input for processors (chiefly +regular expression engines) that want to do their own backslash escape processing. Such processors consider an unmatched trailing backslash to be an error anyway, so raw strings disallow that. In return, they allow you to pass on the string quote character by escaping it with a backslash. These rules work well when r-strings are used for their intended purpose. +

+If you're trying to build Windows pathnames, note that all Windows system calls accept forward slashes too: +

+

+    f = open("/mydir/file.txt") # works fine!
+
+If you're trying to build a pathname for a DOS command, try e.g. one of +

+

+    dir = r"\this\is\my\dos\dir" "\\"
+    dir = r"\this\is\my\dos\dir\ "[:-1]
+    dir = "\\this\\is\\my\\dos\\dir\\"
+
+

+ +Edit this entry / +Log info + +/ Last changed on Mon Jul 13 20:50:20 1998 by +Tim Peters +

+ +


+

6.30. Why can't I use an assignment in an expression?

+Many people used to C or Perl complain that they want to be able to +use e.g. this C idiom: +

+

+    while (line = readline(f)) {
+        ...do something with line...
+    }
+
+where in Python you're forced to write this: +

+

+    while 1:
+        line = f.readline()
+        if not line:
+            break
+        ...do something with line...
+
+This issue comes up in the Python newsgroup with alarming frequency +-- search Deja News for past messages about assignment expression. +The reason for not allowing assignment in Python expressions +is a common, hard-to-find bug in those other languages, +caused by this construct: +

+

+    if (x = 0) {
+        ...error handling...
+    }
+    else {
+        ...code that only works for nonzero x...
+    }
+
+Many alternatives have been proposed. Most are hacks that save some +typing but use arbitrary or cryptic syntax or keywords, +and fail the simple criterion that I use for language change proposals: +it should intuitively suggest the proper meaning to a human reader +who has not yet been introduced to the construct. +

+The earliest time something can be done about this will be with +Python 2.0 -- if it is decided that it is worth fixing. +An interesting phenomenon is that most experienced Python programmers +recognize the "while 1" idiom and don't seem to be missing the +assignment in expression construct much; it's only the newcomers +who express a strong desire to add this to the language. +

+One fairly elegant solution would be to introduce a new operator +for assignment in expressions spelled ":=" -- this avoids the "=" +instead of "==" problem. It would have the same precedence +as comparison operators but the parser would flag combination with +other comparisons (without disambiguating parentheses) as an error. +

+Finally -- there's an alternative way of spelling this that seems +attractive but is generally less robust than the "while 1" solution: +

+

+    line = f.readline()
+    while line:
+        ...do something with line...
+        line = f.readline()
+
+The problem with this is that if you change your mind about exactly +how you get the next line (e.g. you want to change it into +sys.stdin.readline()) you have to remember to change two places +in your program -- the second one hidden at the bottom of the loop. +

+ +Edit this entry / +Log info + +/ Last changed on Tue May 18 00:57:41 1999 by +Andrew Dalke +

+ +


+

6.31. Why doesn't Python have a "with" statement like some other languages?

+Basically, because such a construct would be terribly ambiguous. Thanks to Carlos Ribeiro for the following remarks: +

+Some languages, such as Object Pascal, Delphi, and C++, use static types. So it is possible to know, in an unambiguous way, what member is being assigned in a "with" clause. This is the main point - the compiler always knows the scope of every variable at compile time. +

+Python uses dynamic types. It is impossible to know in advance which +attribute will be referenced at runtime. Member attributes may be added or removed from objects on the fly. This would make it impossible to know, from a simple reading, what attribute is being referenced - a local one, a global one, or a member attribute. +

+For instance, take the following snippet (it is incomplete btw, just to +give you the idea): +

+

+   def with_is_broken(a):
+      with a:
+         print x
+
+The snippet assumes that "a" must have a member attribute called "x". +However, there is nothing in Python that guarantees that. What should +happen if "a" is, let us say, an integer? And if I have a global variable named "x", will it end up being used inside the with block? As you see, the dynamic nature of Python makes such choices much harder. +

+The primary benefit of "with" and similar language features (reduction of code volume) can, however, easily be achieved in Python by assignment. Instead of: +

+

+    function(args).dict[index][index].a = 21
+    function(args).dict[index][index].b = 42
+    function(args).dict[index][index].c = 63
+
+write: +

+

+    ref = function(args).dict[index][index]
+    ref.a = 21
+    ref.b = 42
+    ref.c = 63
+
+This also has the happy side-effect of increasing execution speed, since name bindings are resolved at run-time in Python, and the second method only needs to perform the resolution once. If the referenced object does not have a, b and c attributes, of course, the end result is still a run-time exception. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Jan 11 14:32:58 2002 by +Steve Holden +

+ +


+

6.32. Why are colons required for if/while/def/class?

+The colon is required primarily to enhance readability (one of the +results of the experimental ABC language). Consider this: +

+

+    if a==b
+        print a
+
+versus +

+

+    if a==b:
+        print a
+
+Notice how the second one is slightly easier to read. Notice further how +a colon sets off the example in the second line of this FAQ answer; it's +a standard usage in English. Finally, the colon makes life easier for +editors that use syntax highlighting: they can look for colons to decide +when the indentation of the next line needs to be increased. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Jun 3 07:22:57 2002 by +Matthias Urlichs +

+ +


+

6.33. Can't we get rid of the Global Interpreter Lock?

+The Global Interpreter Lock (GIL) is often seen as a hindrance to +Python's deployment on high-end multiprocessor server machines, +because a multi-threaded Python program effectively only uses +one CPU, due to the insistence that (almost) all Python code +can only run while the GIL is held. +

+Back in the days of Python 1.5, Greg Stein actually implemented +a comprehensive patch set ("free threading") +that removed the GIL, replacing it with +fine-grained locking. Unfortunately, even on Windows (where locks +are very efficient) this ran ordinary Python code about twice as +slow as the interpreter using the GIL. On Linux the performance +loss was even worse (pthread locks aren't as efficient). +

+Since then, the idea of getting rid of the GIL has occasionally +come up but nobody has found a way to deal with the expected slowdown; +Greg's free threading patch set has not been kept up-to-date for +later Python versions. +

+This doesn't mean that you can't make good use of Python on +multi-CPU machines! You just have to be creative with dividing +the work up between multiple processes rather than multiple +threads. +
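
+For example, on Unix one rough sketch is to fork worker processes with os.fork (illustrative only; the work function is a stand-in for a real CPU-bound task): +

+

+    import os
+
+    def work(n):                      # stand-in for a CPU-bound task
+        total = 0
+        for i in xrange(n):
+            total = total + i
+
+    pids = []
+    for n in (1000000, 2000000):
+        pid = os.fork()               # one child process per piece of work
+        if pid == 0:
+            work(n)
+            os._exit(0)
+        pids.append(pid)
+    for pid in pids:
+        os.waitpid(pid, 0)            # wait for the children to finish
+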

+

+It has been suggested that the GIL should be a per-interpreter-state +lock rather than truly global; interpreters then wouldn't be able +to share objects. Unfortunately, this isn't likely to happen either. +

+It would be a tremendous amount of work, because many object +implementations currently have global state. E.g. small ints and +small strings are cached; these caches would have to be moved to the +interpreter state. Other object types have their own free list; these +free lists would have to be moved to the interpreter state. And so +on. +

+And I doubt that it can even be done in finite time, because the same +problem exists for 3rd party extensions. It is likely that 3rd party +extensions are being written at a faster rate than you can convert +them to store all their global state in the interpreter state. +

+And finally, once you have multiple interpreters not sharing any +state, what have you gained over running each interpreter +in a separate process? +

+ +Edit this entry / +Log info + +/ Last changed on Fri Feb 7 16:34:01 2003 by +GvR +

+ +


+

7. Using Python on non-UNIX platforms

+ +
+

7.1. Is there a Mac version of Python?

+Yes, it is maintained by Jack Jansen. See Jack's MacPython Page: +

+

+  http://www.cwi.nl/~jack/macpython.html
+
+

+ +Edit this entry / +Log info + +/ Last changed on Fri May 4 09:33:42 2001 by +GvR +

+ +


+

7.2. Are there DOS and Windows versions of Python?

+Yes. The core windows binaries are available from http://www.python.org/windows/. There is a plethora of Windows extensions available, including a large number of not-always-compatible GUI toolkits. The core binaries include the standard Tkinter GUI extension. +

+Most windows extensions can be found (or referenced) at http://www.python.org/windows/ +

+Windows 3.1/DOS support seems to have dropped off recently. You may need to settle for an old version of Python on these platforms. One such port is WPY +

+WPY: Ports to DOS, Windows 3.1(1), Windows 95, Windows NT and OS/2. +Also contains a GUI package that offers portability between Windows +(not DOS) and Unix, and native look and feel on both. +ftp://ftp.python.org/pub/python/wpy/. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Jun 2 20:21:57 1998 by +Mark Hammond +

+ +


+

7.3. Is there an OS/2 version of Python?

+Yes, see http://www.python.org/download/download_os2.html. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Sep 7 11:33:16 1999 by +GvR +

+ +


+

7.4. Is there a VMS version of Python?

+Jean-François Piéronne has ported 2.1.3 to OpenVMS. It can be found at +<http://vmspython.dyndns.org/>. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Sep 19 15:40:38 2002 by +Skip Montanaro +

+ +


+

7.5. What about IBM mainframes, or other non-UNIX platforms?

+I haven't heard about these, except I remember hearing about an +OS/9 port and a port to VxWorks (both operating systems for embedded +systems). If you're interested in any of this, go directly to the +newsgroup and ask there; you may find exactly what you need. For +example, a port to MPE/iX 5.0 on HP3000 computers was just announced; +see http://www.allegro.com/software/. +

+On the IBM mainframe side, for z/OS there's a port of Python 1.4 that goes with their open-unix package, formerly OpenEdition MVS, (http://www-1.ibm.com/servers/eserver/zseries/zos/unix/python.html). On a side note, there's also a Java VM ported - so, in theory, Jython could run too. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Nov 18 03:18:39 2002 by +Bruno Jessen +

+ +


+

7.6. Where are the source or Makefiles for the non-UNIX versions?

+The standard sources can (almost) be used. Additional sources can +be found in the platform-specific subdirectories of the distribution. +

+ +Edit this entry / +Log info +

+ +


+

7.7. What is the status and support for the non-UNIX versions?

+I don't have access to most of these platforms, so in general I am +dependent on material submitted by volunteers. However I strive to +integrate all changes needed to get it to compile on a particular +platform back into the standard sources, so porting of the next +version to the various non-UNIX platforms should be easy. +(Note that Linux is classified as a UNIX platform here. :-) +

+Some specific platforms: +

+Windows: all versions (95, 98, ME, NT, 2000, XP) are supported, +all python.org releases come with a Windows installer. +

+MacOS: Jack Jansen does an admirable job of keeping the Mac version +up to date (both MacOS X and older versions); +see http://www.cwi.nl/~jack/macpython.html +

+For all supported platforms, see http://www.python.org/download/ +(follow the link to "Other platforms" for less common platforms) +

+ +Edit this entry / +Log info + +/ Last changed on Fri May 24 21:34:24 2002 by +GvR +

+ +


+

7.8. I have a PC version but it appears to be only a binary. Where's the library?

+If you are running any version of Windows, then you have the wrong distribution. The FAQ lists current Windows versions. Notably, Pythonwin and wpy provide fully functional installations. +

+But if you are sure you have the only distribution with a hope of working on +your system, then... +

+You still need to copy the files from the distribution directory +"python/Lib" to your system. If you don't have the full distribution, +you can get the file lib<version>.tar.gz from most ftp sites carrying +Python; this is a subset of the distribution containing just those +files, e.g. ftp://ftp.python.org/pub/python/src/lib1.4.tar.gz. +

+Once you have installed the library, you need to point sys.path to it. +Assuming the library is in C:\misc\python\lib, the following commands +will point your Python interpreter to it (note the doubled backslashes +-- you can also use single forward slashes instead): +

+

+        >>> import sys
+        >>> sys.path.insert(0, 'C:\\misc\\python\\lib')
+        >>>
+
+For a more permanent effect, set the environment variable PYTHONPATH, +as follows (talking to a DOS prompt): +

+

+        C> SET PYTHONPATH=C:\misc\python\lib
+
+

+ +Edit this entry / +Log info + +/ Last changed on Fri May 23 16:28:27 1997 by +Ken Manheimer +

+ +


+

7.9. Where's the documentation for the Mac or PC version?

+The documentation for the Unix version also applies to the Mac and +PC versions. Where applicable, differences are indicated in the text. +

+ +Edit this entry / +Log info +

+ +


+

7.10. How do I create a Python program file on the Mac or PC?

+Use an external editor. On the Mac, BBEdit seems to be a popular +no-frills text editor. I work like this: start the interpreter; edit +a module file using BBedit; import and test it in the interpreter; +edit again in BBedit; then use the built-in function reload() to +re-read the imported module; etc. In the 1.4 distribution +you will find a BBEdit extension that makes life a little easier: +it can tell the interpreter to execute the current window. +See :Mac:Tools:BBPy:README. +

+Regarding the same question for the PC, Kurt Wm. Hemr writes: "While +anyone with a pulse could certainly figure out how to do the same on +MS-Windows, I would recommend the NotGNU Emacs clone for MS-Windows. +Not only can you easily resave and "reload()" from Python after making +changes, but since WinNot auto-copies to the clipboard any text you +select, you can simply select the entire procedure (function) which +you changed in WinNot, switch to QWPython, and shift-ins to reenter +the changed program unit." +

+If you're using Windows95 or Windows NT, you should also know about +PythonWin, which provides a GUI framework, with a mouse-driven +editor, an object browser, and a GUI-based debugger. See +

+       http://www.python.org/ftp/python/pythonwin/
+
+for details. +

+ +Edit this entry / +Log info + +/ Last changed on Sun May 25 10:04:25 1997 by +GvR +

+ +


+

7.11. How can I use Tkinter on Windows 95/NT?

+Starting from Python 1.5, it's very easy -- just download and install +Python and Tcl/Tk and you're in business. See +

+

+  http://www.python.org/download/download_windows.html
+
+One warning: don't attempt to use Tkinter from PythonWin +(Mark Hammond's IDE). Use it from the command line interface +(python.exe) or the windowless interpreter (pythonw.exe). +

+ +Edit this entry / +Log info + +/ Last changed on Fri Jun 12 09:32:48 1998 by +GvR +

+ +


+

7.12. cgi.py (or other CGI programming) doesn't work sometimes on NT or win95!

+Be sure you have the latest python.exe, that you are using +python.exe rather than a GUI version of Python, and that you +have configured the server to execute +

+

+     "...\python.exe -u ..."
+
+for the cgi execution. The -u (unbuffered) option on NT and +win95 prevents the interpreter from altering newlines in the +standard input and output. Without it, post/multipart requests +will seem to have the wrong length and binary (e.g., GIF) +responses may get garbled (resulting in, e.g., a "broken image"). +

+ +Edit this entry / +Log info + +/ Last changed on Wed Jul 30 10:48:02 1997 by +aaron watters +

+ +


+

7.13. Why doesn't os.popen() work in PythonWin on NT?

+The reason that os.popen() doesn't work from within PythonWin is due to a bug in Microsoft's C Runtime Library (CRT). The CRT assumes you have a Win32 console attached to the process. +

+You should use the win32pipe module's popen() instead which doesn't depend on having an attached Win32 console. +

+Example: +

+ import win32pipe
+ f = win32pipe.popen('dir /c c:\\')
+ print f.readlines()
+ f.close()
+
+

+ +Edit this entry / +Log info + +/ Last changed on Thu Jul 31 15:34:09 1997 by +Bill Tutt +

+ +


+

7.14. How do I use different functionality on different platforms with the same program?

+Remember that Python is extremely dynamic and that you +can use this dynamism to configure a program at run-time to +use available functionality on different platforms. For example +you can test the sys.platform and import different modules based +on its value. +

+

+   import sys
+   if sys.platform == "win32":
+      import win32pipe
+      popen = win32pipe.popen
+   else:
+      import os
+      popen = os.popen
+
+(See FAQ 7.13 for an explanation of why you might want to +do something like this.) Also you can try to import a module +and use a fallback if the import fails: +

+

+    try:
+         import really_fast_implementation
+         choice = really_fast_implementation
+    except ImportError:
+         import slower_implementation
+         choice = slower_implementation
+
+

+ +Edit this entry / +Log info + +/ Last changed on Wed Aug 13 07:39:06 1997 by +aaron watters +

+ +


+

7.15. Is there an Amiga version of Python?

+Yes. See the AmigaPython homepage at http://www.bigfoot.com/~irmen/python.html. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Dec 14 06:53:32 1998 by +Irmen de Jong +

+ +


+

7.16. Why doesn't os.popen()/win32pipe.popen() work on Win9x?

+There is a bug in Win9x that prevents os.popen/win32pipe.popen* from working. The good news is there is a way to work around this problem. +The Microsoft Knowledge Base article that you need to look up is: Q150956. You will find links to the knowledge base at: +http://www.microsoft.com/kb. +

+ +Edit this entry / +Log info + +/ Last changed on Fri Jun 25 10:45:38 1999 by +Bill Tutt +

+ +


+

8. Python on Windows

+ +
+

8.1. Using Python for CGI on Microsoft Windows

+** Setting up the Microsoft IIS Server/Peer Server +

+On the Microsoft IIS +server or on the Win95 MS Personal Web Server +you set up python in the same way that you +would set up any other scripting engine. +

+Run regedt32 and go to: +

+HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\W3SVC\Parameters\ScriptMap +

+and enter the following line (making any specific changes that your system may need) +

+.py :REG_SZ: c:\<path to python>\python.exe -u %s %s +

+This line will allow you to call your script with a simple reference like: +http://yourserver/scripts/yourscript.py +provided "scripts" is an "executable" directory for your server (which +it usually is by default). +The "-u" flag specifies unbuffered and binary mode for stdin - needed when working with binary data +

+In addition, it is recommended by people who would know that ".py" may +not be a good choice of file extension when used in this context +(you might want to reserve *.py for support modules and use *.cgi or *.cgp +for "main program" scripts). +However, that issue is beyond this Windows FAQ entry. +

+

+** Apache configuration +

+In the Apache configuration file httpd.conf, add the following line at +the end of the file: +

+ScriptInterpreterSource Registry +

+Then, give your Python CGI-scripts the extension .py and put them in the cgi-bin directory. +

+

+** Netscape Servers: +Information on this topic exists at: +http://home.netscape.com/comprod/server_central/support/fasttrack_man/programs.htm#1010870 +

+ +Edit this entry / +Log info + +/ Last changed on Wed Mar 27 12:25:54 2002 by +Gerhard Häring +

+ +


+

8.2. How to check for a keypress without blocking?

+Use the msvcrt module. This is a standard Windows-specific extension +in Python 1.5 and beyond. It defines a function kbhit() which checks +whether a keyboard hit is present; also getch() which gets one +character without echo. Plus a few other goodies. +
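
+For example, a minimal sketch of a polling loop: +

+

+    import msvcrt, time
+
+    while 1:
+        if msvcrt.kbhit():
+            c = msvcrt.getch()        # read the key without echoing it
+            print "you pressed", c
+            break
+        time.sleep(0.1)               # do other work here instead of sleeping
+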

+(Search for "keypress" to find an answer for Unix as well.) +

+ +Edit this entry / +Log info + +/ Last changed on Mon Mar 30 16:21:46 1998 by +GvR +

+ +


+

8.3. $PYTHONPATH

+In MS-DOS derived environments, a unix variable such as $PYTHONPATH is +set as PYTHONPATH, without the dollar sign. PYTHONPATH is useful for +specifying the location of library files. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Jun 11 00:41:26 1998 by +Gvr +

+ +


+

8.4. dedent syntax errors

+The FAQ does not recommend using tabs, and Guido's Python Style Guide recommends 4 spaces for distributed Python code; this is also the Emacs python-mode default; see +

+

+    http://www.python.org/doc/essays/styleguide.html
+
+Under any editor mixing tabs and spaces is a bad idea. MSVC is no different in this respect, and is easily configured to use spaces: Take Tools -> Options -> Tabs, and for file type "Default" set "Tab size" and "Indent size" to 4, and select the "Insert spaces" radio button. +

+If you suspect mixed tabs and spaces are causing problems in leading whitespace, run Python with the -t switch, or run Tools/Scripts/tabnanny.py to check a directory tree in batch mode. +

+ +Edit this entry / +Log info + +/ Last changed on Mon Feb 12 15:04:14 2001 by +Steve Holden +

+ +


+

8.5. How do I emulate os.kill() in Windows?

+Use win32api: +

+

+    def kill(pid):
+        """kill function for Win32"""
+        import win32api
+        handle = win32api.OpenProcess(1, 0, pid)
+        return (0 != win32api.TerminateProcess(handle, 0))
+
+

+ +Edit this entry / +Log info + +/ Last changed on Sat Aug 8 18:55:06 1998 by +Jeff Bauer +

+ +


+

8.6. Why does os.path.isdir() fail on NT shared directories?

+The solution appears to be to always append the "\\" to +the end of shared drive names. +

+

+  >>> import os
+  >>> os.path.isdir( '\\\\rorschach\\public')
+  0
+  >>> os.path.isdir( '\\\\rorschach\\public\\')
+  1
+
+[Blake Winton responds:] +I've had the same problem doing "Start >> Run" and then a +directory on a shared drive. If I use "\\rorschach\public", +it will fail, but if I use "\\rorschach\public\", it will +work. For that matter, os.stat() does the same thing (well, +it gives an error for "\\\\rorschach\\public", but you get +the idea)... +

+I've got a theory about why this happens, but it's only +a theory. NT knows the difference between shared directories +and regular directories. "\\rorschach\public" isn't a +directory, it's _really_ an IPC abstraction. This is sort +of lent credence by the fact that when you're mapping +a network drive, you can't map "\\rorschach\public\utils", +but only "\\rorschach\public". +

+[Clarification by funkster@midwinter.com] +It's not actually a Python +question, as Python is working just fine; it's clearing up something +a bit muddled about Windows networked drives. +

+It helps to think of share points as being like drive letters. +Example: +

+        k: is not a directory
+        k:\ is a directory
+        k:\media is a directory
+        k:\media\ is not a directory
+
+The same rules apply if you substitute "k:" with "\\conky\foo": +
+        \\conky\foo  is not a directory
+        \\conky\foo\ is a directory
+        \\conky\foo\media is a directory
+        \\conky\foo\media\ is not a directory
+
+

+ +Edit this entry / +Log info + +/ Last changed on Sun Jan 31 08:44:48 1999 by +GvR +

+ +


+

8.7. PyRun_SimpleFile() crashes on Windows but not on Unix

+I've seen a number of reports of PyRun_SimpleFile() failing +in a Windows port of an application embedding Python that worked +fine on Unix. PyRun_SimpleString() works fine on both platforms. +

+I think this happens because the application was compiled with a +different set of compiler flags than Python15.DLL. It seems that some +compiler flags affect the standard I/O library in such a way that +using different flags makes calls fail. You need to set it for +the non-debug multi-threaded DLL (/MD on the command line, or it can be set via MSVC under Project Settings -> C++/Code Generation, in the "Use run-time library" dropdown.) +

+Also note that you can not mix-and-match Debug and Release versions. If you wish to use the Debug Multithreaded DLL, then your module _must_ have an "_d" appended to the base name. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Nov 17 17:37:07 1999 by +Mark Hammond +

+ +


+

8.8. Import of _tkinter fails on Windows 95/98

+Sometimes, the import of _tkinter fails on Windows 95 or 98, +complaining with a message like the following: +

+

+  ImportError: DLL load failed: One of the library files needed
+  to run this application cannot be found.
+
+It could be that you haven't installed Tcl/Tk, but if you did +install Tcl/Tk, and the Wish application works correctly, +the problem may be that its installer didn't +manage to edit the autoexec.bat file correctly. It tries to add a +statement that changes the PATH environment variable to include +the Tcl/Tk 'bin' subdirectory, but sometimes this edit doesn't +quite work. Opening it with notepad usually reveals what the +problem is. +

+(One additional hint, noted by David Szafranski: you can't use +long filenames here; e.g. use C:\PROGRA~1\Tcl\bin instead of +C:\Program Files\Tcl\bin.) +

+ +Edit this entry / +Log info + +/ Last changed on Wed Dec 2 22:32:41 1998 by +GvR +

+ +


+

8.9. Can't extract the downloaded documentation on Windows

+Sometimes, when you download the documentation package to a Windows +machine using a web browser, the file extension of the saved file +ends up being .EXE. This is a mistake; the extension should be .TGZ. +

+Simply rename the downloaded file to have the .TGZ extension, and +WinZip will be able to handle it. (If your copy of WinZip doesn't, +get a newer one from http://www.winzip.com.) +

+ +Edit this entry / +Log info + +/ Last changed on Sat Nov 21 13:41:35 1998 by +GvR +

+ +


+

8.10. Can't get PyRun_SimpleFile() to work.

+This is very sensitive to the compiler vendor, version and (perhaps) +even options. If the FILE* structure in your embedding program isn't +the same as is assumed by the Python interpreter it won't work. +

+The Python 1.5.* DLLs (python15.dll) are all compiled +with MS VC++ 5.0 and with multithreading-DLL options (/MD, I think). +

+If you can't change compilers or flags, try using PyRun_SimpleString(). +A trick to get it to run an arbitrary file is to construct a call to +execfile() with the name of your file as argument. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Jan 13 10:58:14 1999 by +GvR +

+ +


+

8.11. Where is Freeze for Windows?

+("Freeze" is a program that allows you to ship a Python program +as a single stand-alone executable file. It is not a compiler, +your programs don't run any faster, but they are more easily +distributable (to platforms with the same OS and CPU). Read the +README file of the freeze program for more disclaimers.) +

+You can use freeze on Windows, but you must download the source +tree (see http://www.python.org/download/download_source.html). +This is recommended for Python 1.5.2 (and betas thereof) only; +older versions don't quite work. +

+You need the Microsoft VC++ 5.0 compiler (maybe it works with +6.0 too). You probably need to build Python -- the project files +are all in the PCbuild directory. +

+The freeze program is in the Tools\freeze subdirectory of the source +tree. +

+ +Edit this entry / +Log info + +/ Last changed on Wed Feb 17 18:47:24 1999 by +GvR +

+ +


+

8.12. Is a *.pyd file the same as a DLL?

+Yes, .pyd files are DLLs. But there are a few differences. If you +have a DLL named foo.pyd, then it must have a function initfoo(). You +can then write Python "import foo", and Python will search for foo.pyd +(as well as foo.py, foo.pyc) and if it finds it, will attempt to call +initfoo() to initialize it. You do not link your .exe with foo.lib, +as that would cause Windows to require the DLL to be present. +

+Note that the search path for foo.pyd is PYTHONPATH, not the same as +the path that Windows uses to search for foo.dll. Also, foo.pyd need +not be present to run your program, whereas if you linked your program +with a dll, the dll is required. Of course, foo.pyd is required if +you want to say "import foo". In a dll, linkage is declared in the +source code with __declspec(dllexport). In a .pyd, linkage is defined +in a list of available functions. +

+ +Edit this entry / +Log info + +/ Last changed on Tue Nov 23 02:40:08 1999 by +Jameson Quinn +

+ +


+

8.13. Missing cw3215mt.dll (or missing cw3215.dll)

+Sometimes, when using Tkinter on Windows, you get an error that +cw3215mt.dll or cw3215.dll is missing. +

+Cause: you have an old Tcl/Tk DLL built with cygwin in your path +(probably C:\Windows). You must use the Tcl/Tk DLLs from the +standard Tcl/Tk installation (Python 1.5.2 comes with one). +

+ +Edit this entry / +Log info + +/ Last changed on Fri Jun 11 00:54:13 1999 by +GvR +

+ +


+

8.14. How to make python scripts executable:

+[Blake Coverett] +

+Win2K: +

+The standard installer already associates the .py extension with a file type +(Python.File) and gives that file type an open command that runs the +interpreter (D:\Program Files\Python\python.exe "%1" %*). This is enough to +make scripts executable from the command prompt as 'foo.py'. If you'd +rather be able to execute the script by simply typing 'foo' with no +extension you need to add .py to the PATHEXT environment variable. +

+WinNT: +

+The steps taken by the installer as described above allow you to run a +script with 'foo.py', but a long-standing bug in the NT command processor +prevents you from redirecting the input or output of any script executed in +this way. This is often important. +

+An appropriate incantation for making a Python script executable under WinNT +is to give the file an extension of .cmd and add the following as the first +line: +

+

+    @setlocal enableextensions & python -x %~f0 %* & goto :EOF
+
+Win9x: +

+[Due to Bruce Eckel] +

+

+  @echo off
+  rem = """
+  rem run python on this bat file. Needs the full path where
+  rem you keep your python files. The -x causes python to skip
+  rem the first line of the file:
+  python -x c:\aaa\Python\\"%0".bat %1 %2 %3 %4 %5 %6 %7 %8 %9
+  goto endofpython
+  rem """
+
+
+  # The python program goes here:
+
+
+  print "hello, Python"
+
+
+  # For the end of the batch file:
+  rem = """
+  :endofpython
+  rem """
+
+

+ +Edit this entry / +Log info + +/ Last changed on Tue Nov 30 10:25:17 1999 by +GvR +

+ +


+

8.15. Warning about CTL3D32 version from installer

+The Python installer issues a warning like this: +

+

+  This version uses CTL3D32.DLL whitch is not the correct version.
+  This version is used for windows NT applications only.
+
+[Tim Peters] +This is a Microsoft DLL, and a notorious +source of problems. The msg means what it says: you have the wrong version +of this DLL for your operating system. The Python installation did not +cause this -- something else you installed previous to this overwrote the +DLL that came with your OS (probably older shareware of some sort, but +there's no way to tell now). If you search for "CTL3D32" using any search +engine (AltaVista, for example), you'll find hundreds and hundreds of web +pages complaining about the same problem with all sorts of installation +programs. They'll point you to ways to get the correct version reinstalled +on your system (since Python doesn't cause this, we can't fix it). +

+David A Burton has written a little program to fix this. Go to +http://www.burtonsys.com/download.html and click on "ctl3dfix.zip" +

+ +Edit this entry / +Log info + +/ Last changed on Thu Oct 26 15:42:00 2000 by +GvR +

+ +


+

8.16. How can I embed Python into a Windows application?

+Edward K. Ream <edream@tds.net> writes +

+When '##' appears in a file name below, it is an abbreviated version number. For example, for Python 2.1.1, ## will be replaced by 21. +

+Embedding the Python interpreter in a Windows app can be summarized as +follows: +

+1. Do _not_ build Python into your .exe file directly. On Windows, +Python must be a DLL to handle importing modules that are themselves +DLL's. (This is the first key undocumented fact.) Instead, link to +python##.dll; it is typically installed in c:\Windows\System. +

+You can link to Python statically or dynamically. Linking statically +means linking against python##.lib The drawback is that your app won't +run if python##.dll does not exist on your system. +

+General note: python##.lib is the so-called "import lib" corresponding +to python.dll. It merely defines symbols for the linker. +

+Borland note: convert python##.lib to OMF format using Coff2Omf.exe +first. +

+Linking dynamically greatly simplifies link options; everything happens +at run time. Your code must load python##.dll using the Windows +LoadLibraryEx() routine. The code must also use access routines and +data in python##.dll (that is, Python's C API's) using pointers +obtained by the Windows GetProcAddress() routine. Macros can make +using these pointers transparent to any C code that calls routines in +Python's C API. +

+2. If you use SWIG, it is easy to create a Python "extension module" +that will make the app's data and methods available to Python. SWIG +will handle just about all the grungy details for you. The result is C +code that you link _into your .exe file_ (!) You do _not_ have to +create a DLL file, and this also simplifies linking. +

+3. SWIG will create an init function (a C function) whose name depends +on the name of the extension module. For example, if the name of the +module is leo, the init function will be called initleo(). If you use +SWIG shadow classes, as you should, the init function will be called +initleoc(). This initializes a mostly hidden helper class used by the +shadow class. +

+The reason you can link the C code in step 2 into your .exe file is that +calling the initialization function is equivalent to importing the +module into Python! (This is the second key undocumented fact.) +

+4. In short, you can use the following code to initialize the Python +interpreter with your extension module. +

+

+    #include "Python.h"
+    ...
+    Py_Initialize();  // Initialize Python.
+    initmyAppc();  // Initialize (import) the helper class. 
+    PyRun_SimpleString("import myApp") ;  // Import the shadow class.
+
+5. There are two problems with Python's C API which will become apparent +if you use a compiler other than MSVC, the compiler used to build +python##.dll. +

+Problem 1: The so-called "Very High Level" functions that take FILE * +arguments will not work in a multi-compiler environment; each compiler's +notion of a struct FILE will be different. From an implementation +standpoint these are very _low_ level functions. +

+Problem 2: SWIG generates the following code when generating wrappers to +void functions: +

+

+    Py_INCREF(Py_None);
+    _resultobj = Py_None;
+    return _resultobj;
+
+Alas, Py_None is a macro that expands to a reference to a complex data +structure called _Py_NoneStruct inside python##.dll. Again, this code +will fail in a multi-compiler environment. Replace such code by: +

+

+    return Py_BuildValue("");
+
+It may be possible to use SWIG's %typemap command to make the change +automatically, though I have not been able to get this to work (I'm a +complete SWIG newbie). +

+6. Using a Python shell script to put up a Python interpreter window +from inside your Windows app is not a good idea; the resulting window +will be independent of your app's windowing system. Rather, you (or the +wxPythonWindow class) should create a "native" interpreter window. It +is easy to connect that window to the Python interpreter. You can +redirect Python's i/o to _any_ object that supports read and write, so +all you need is a Python object (defined in your extension module) that +contains read() and write() methods. +

+ +Edit this entry / +Log info + +/ Last changed on Thu Jan 31 16:29:34 2002 by +Victor Kryukov +

+ +


+

8.17. Setting up IIS 5 to use Python for CGI

+In order to set up Internet Information Services 5 to use Python for CGI processing, please see the following links: +

+http://www.e-coli.net/pyiis_server.html (for Win2k Server) +http://www.e-coli.net/pyiis.html (for Win2k pro) +

+ +Edit this entry / +Log info + +/ Last changed on Fri Mar 22 22:05:51 2002 by +douglas savitsky +

+ +


+

8.18. How do I run a Python program under Windows?

+This is not necessarily quite the straightforward question it appears +to be. If you are already familiar with running programs from the +Windows command line then everything will seem really easy and +obvious. If your computer experience is limited then you might need a +little more guidance. Also there are differences between Windows 95, +98, NT, ME, 2000 and XP which can add to the confusion. You might +think of this as "why I pay software support charges" if you have a +helpful and friendly administrator to help you set things up without +having to understand all this yourself. If so, then great! Show them +this page and it should be a done deal. +

+Unless you use some sort of integrated development environment (such +as PythonWin or IDLE, to name only two in a growing family) then you +will end up typing Windows commands into what is variously referred +to as a "DOS window" or "Command prompt window". Usually you can +create such a window from your Start menu (under Windows 2000 I use +"Start | Programs | Accessories | Command Prompt"). You should be +able to recognize when you have started such a window because you will +see a Windows "command prompt", which usually looks like this: +

+

+    C:\>
+
+The letter may be different, and there might be other things after it, +so you might just as easily see something like: +

+

+    D:\Steve\Projects\Python>
+
+depending on how your computer has been set up and what else you have +recently done with it. Once you have started such a window, you are +well on the way to running Python programs. +

+You need to realize that your Python scripts have to be processed by +another program, usually called the "Python interpreter". The +interpreter reads your script, "compiles" it into "Python bytecodes" +(which are instructions for an imaginary computer known as the "Python +Virtual Machine") and then executes the bytecodes to run your +program. So, how do you arrange for the interpreter to handle your +Python? +

+First, you need to make sure that your command window recognises the +word "python" as an instruction to start the interpreter. If you have +opened a command window, you should try entering the command: +

+

+    python
+
+and hitting return. If you then see something like: +

+

+    Python 2.2 (#28, Dec 21 2001, 12:21:22) [MSC 32 bit (Intel)] on win32
+    Type "help", "copyright", "credits" or "license" for more information.
+    >>>
+
+then this part of the job has been correctly managed during Python's +installation process, and you have started the interpreter in +"interactive mode". That means you can enter Python statements or +expressions interactively and have them executed or evaluated while +you wait. This is one of Python's strongest features, but it takes a +little getting used to. Check it by entering a few expressions of your +choice and seeing the results... +

+

+    >>> print "Hello"
+    Hello
+    >>> "Hello" * 3
+    HelloHelloHello
+
+When you want to end your interactive Python session, enter a +terminator (hold the Ctrl key down while you enter a Z, then hit the +"Enter" key) to get back to your Windows command prompt. You may also +find that you have a Start-menu entry such as "Start | Programs | +Python 2.2 | Python (command line)" that results in you seeing the +">>>" prompt in a new window. If so, the window will disappear after +you enter the terminator -- Windows runs a single "python" command in +the window, which terminates when you terminate the interpreter. +

+If the "python" command, instead of displaying the interpreter prompt ">>>", gives you a message like +

+

+    'python' is not recognized as an internal or external command,
+    operable program or batch file.
+
+or +

+

+    Bad command or filename
+
+then you need to make sure that your computer knows where to find the +Python interpreter. To do this you will have to modify a setting +called the PATH, which is just a list of directories where Windows +will look for programs. Rather than just enter the right command every +time you create a command window, you should arrange for Python's +installation directory to be added to the PATH of every command window +as it starts. If you installed Python fairly recently then the command +

+

+    dir C:\py*
+
+will probably tell you where it is installed. Alternatively, perhaps +you made a note. Otherwise you will be reduced to a search of your +whole disk ... break out the Windows explorer and use "Tools | Find" +or hit the "Search" button and look for "python.exe". Suppose you +discover that Python is installed in the C:\Python22 directory (the +default at the time of writing) then you should make sure that +entering the command +

+

+    c:\Python22\python
+
+starts up the interpreter as above (and don't forget you'll need a +"CTRL-Z" and an "Enter" to get out of it). Once you have verified the +directory, you need to add it to the start-up routines your computer +goes through. For older versions of Windows the easiest way to do +this is to edit the C:\AUTOEXEC.BAT file. You would want to add a line +like the following to AUTOEXEC.BAT: +

+

+    PATH C:\Python22;%PATH%
+
+For Windows NT, 2000 and (I assume) XP, you will need to add a string +such as +

+

+    ;C:\Python22
+
+to the current setting for the PATH environment variable, which you +will find in the properties window of "My Computer" under the +"Advanced" tab. Note that if you have sufficient privilege you might +get a choice of installing the settings either for the Current User or +for System. The latter is preferred if you want everybody to be able +to run Python on the machine. +

+If you aren't confident doing any of these manipulations yourself, ask +for help! At this stage you may or may not want to reboot your system +to make absolutely sure the new setting has "taken" (don't you love +the way Windows gives you these frequent coffee breaks). You probably +won't need to for Windows NT, XP or 2000. You can also avoid it in +earlier versions by editing the file C:\WINDOWS\COMMAND\CMDINIT.BAT +instead of AUTOEXEC.BAT. +

+You should now be able to start a new command window, enter +

+

+    python
+
+at the "C:>" (or whatever) prompt, and see the ">>>" prompt that +indicates the Python interpreter is reading interactive commands. +

+Let's suppose you have a program called "pytest.py" in directory +"C:\Steve\Projects\Python". A session to run that program might look +like this: +

+

+    C:\> cd \Steve\Projects\Python
+    C:\Steve\Projects\Python> python pytest.py
+
+Because you added a file name to the command to start the interpreter, +when it starts up it reads the Python script in the named file, +compiles it, executes it, and terminates (so you see another "C:\>" +prompt). You might also have entered +

+

+    C:\> python \Steve\Projects\Python\pytest.py
+
+if you hadn't wanted to change your current directory. +

+Under NT, 2000 and XP you may well find that the installation process +has also arranged that the command +

+

+    pytest.py
+
+(or, if the file isn't in the current directory) +

+

+    C:\Steve\Projects\Python\pytest.py
+
+will automatically recognize the ".py" extension and run the Python interpreter on the named file. Using this feature is fine, but some versions of Windows have bugs which mean that this form isn't exactly equivalent to using the interpreter explicitly, so be careful. It is easier to remember, for now, that

+

+    python C:\Steve\Projects\Python\pytest.py
+
+works in much the same way, and that output redirection will work (more) reliably.
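+
+For example, to capture a program's output in a file rather than in the window, a command along these lines should work (the file name "output.txt" is just an example):
+
+    C:\> python C:\Steve\Projects\Python\pytest.py > output.txt
+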

+The important things to remember are: +

+1. Start Python from the Start Menu, or make sure the PATH is set +correctly so Windows can find the Python interpreter. +

+

+    python
+
+should give you a ">>>" prompt from the Python interpreter. Don't forget the CTRL-Z and ENTER to terminate the interpreter (which, if you started the window from the Start Menu, will also make the window disappear).

+2. Once this works, you run programs with commands: +

+

+    python {program-file}
+
+3. When you know the commands to use, you can build Windows shortcuts to run the Python interpreter on any of your scripts, naming particular working directories, and adding them to your menus, but that's another lesson. Take a look at

+

+    python --help
+
+if your needs are complex. +

+4. Interactive mode (where you see the ">>>" prompt) is best used +not for running programs, which are better executed as in steps 2 +and 3, but for checking that individual statements and expressions do +what you think they will, and for developing code by experiment. +
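+
+For example, a quick interactive experiment (the values shown are only illustrations) might look like:
+
+    >>> 2 ** 10
+    1024
+    >>> "spam".upper()
+    'SPAM'
+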

+ +Edit this entry / +Log info + +/ Last changed on Tue Aug 20 16:19:53 2002 by +GvR +

+ +


+Python home / +Python FAQ Wizard 1.0.3 / +Feedback to GvR +

Python Powered
+ + --- python3.4-3.4.1.orig/debian/PVER-dbg.README.Debian.in +++ python3.4-3.4.1/debian/PVER-dbg.README.Debian.in @@ -0,0 +1,52 @@ +Contents of the @PVER@-dbg package +------------------------------------- + +For debugging python and extension modules, you may want to add the contents +of /usr/share/doc/@PVER@/gdbinit (found in the @PVER@-dev package) to your +~/.gdbinit file. + +@PVER@-dbg contains two sets of packages: + + - debugging symbols for the standard @PVER@ build. When this package + is installed, gdb will automatically load up the debugging symbols + from it when debugging @PVER@ or one of the included extension + modules. + + - a separate @PVER@-dbg binary, configured --with-pydebug, enabling the + additional debugging code to help debug memory management problems. + +For the latter, all extension modules have to be recompiled to +correctly load with an pydebug enabled build. + + +Debian and Ubuntu specific changes to the debug interpreter +----------------------------------------------------------- +The python2.4 and python2.5 packages in Ubuntu feisty are modified to +first look for extension modules under a different name. + + normal build: foo.so + debug build: foo_d.so foo.so + +This naming schema allows installation of the extension modules into +the same path (The naming is directly taken from the Windows builds +which already uses this naming scheme). + +See https://wiki.ubuntu.com/PyDbgBuilds for more information. + + +Using the python-dbg builds +--------------------------- + + * Call the python-dbg or the pythonX.Y-dbg binaries instead of the + python or pythonX.Y binaries. + + * Properties of the debug build are described in + /usr/share/doc/@PVER@/SpecialBuilds.txt.gz. + The debug interpreter is built with Py_DEBUG defined. + + * From SpecialBuilds.txt: This is what is generally meant by "a debug + build" of Python. Py_DEBUG implies LLTRACE, Py_REF_DEBUG, + Py_TRACE_REFS, and PYMALLOC_DEBUG (if WITH_PYMALLOC is enabled). + In addition, C assert()s are enabled (via the C way: by not defining + NDEBUG), and some routines do additional sanity checks inside + "#ifdef Py_DEBUG" blocks. --- python3.4-3.4.1.orig/debian/PVER-dbg.overrides.in +++ python3.4-3.4.1/debian/PVER-dbg.overrides.in @@ -0,0 +1,5 @@ +# just the gdb debug file +@PVER@-dbg binary: python-script-but-no-python-dep + +# pointless lintian ... +@PVER@-dbg binary: hardening-no-fortify-functions --- python3.4-3.4.1.orig/debian/PVER-dbg.postinst.in +++ python3.4-3.4.1/debian/PVER-dbg.postinst.in @@ -0,0 +1,36 @@ +#! /bin/sh + +set -e + +if [ "$1" = configure ]; then + files=$(dpkg -L lib@PVER@-dbg@HOST_QUAL@ | sed -n '/^\/usr\/lib\/@PVER@\/.*\.py$/p') + if [ -n "$files" ]; then + @PVER@ -E -S /usr/lib/@PVER@/py_compile.py $files + if grep -sq '^byte-compile[^#]*optimize' /etc/python/debian_config; then + @PVER@ -E -S -O /usr/lib/@PVER@/py_compile.py $files + fi + else + echo >&2 "@PVER@-dbg: can't get files for byte-compilation" + fi + + if [ -d /usr/include/@PVER@_d ] && [ ! -h /usr/include/@PVER@_d ]; then + if rmdir /usr/include/@PVER@_d 2> /dev/null; then + ln -sf @PVER@dmu /usr/include/@PVER@_d + else + echo >&2 "WARNING: non-empty directory on upgrade: /usr/include/@PVER@_d" + ls -l /usr/include/@PVER@_d + fi + fi + if [ -d /usr/lib/@PVER@/config_d ] && [ ! 
-h /usr/lib/@PVER@/config_d ]; then + if rmdir /usr/lib/@PVER@/config_d 2> /dev/null; then + ln -sf config-dmu /usr/lib/@PVER@/config_d + else + echo >&2 "WARNING: non-empty directory on upgrade: /usr/lib/@PVER@/config_d" + ls -l /usr/lib/@PVER@/config_d + fi + fi +fi + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/PVER-dbg.prerm.in +++ python3.4-3.4.1/debian/PVER-dbg.prerm.in @@ -0,0 +1,35 @@ +#! /bin/sh + +set -e + +remove_bytecode() +{ + pkg=$1 + max=$(LANG=C LC_ALL=C xargs --show-limits < /dev/null 2>&1 | awk '/Maximum/ {print int($NF / 4)}') + dpkg -L $pkg \ + | awk -F/ 'BEGIN {OFS="/"} /\.py$/ {$NF=sprintf("__pycache__/%s.*.py[co]", substr($NF,1,length($NF)-3)); print}' \ + | xargs --max-chars=$max echo \ + | while read files; do rm -f $files; done + if [ -d /usr/bin/__pycache__ ]; then + rmdir --ignore-fail-on-non-empty /usr/bin/__pycache__ + fi +} + +case "$1" in + remove) + remove_bytecode lib@PVER@-dbg@HOST_QUAL@ + ;; + upgrade) + remove_bytecode lib@PVER@-dbg@HOST_QUAL@ + ;; + deconfigure) + ;; + failed-upgrade) + ;; + *) + echo "prerm called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# --- python3.4-3.4.1.orig/debian/PVER-dev.postinst.in +++ python3.4-3.4.1/debian/PVER-dev.postinst.in @@ -0,0 +1,26 @@ +#! /bin/sh + +set -e + +if [ "$1" = configure ]; then + if [ -d /usr/include/@PVER@ ] && [ ! -h /usr/include/@PVER@ ]; then + if rmdir /usr/include/@PVER@ 2> /dev/null; then + ln -sf @PVER@mu /usr/include/@PVER@ + else + echo >&2 "WARNING: non-empty directory on upgrade: /usr/include/@PVER@" + ls -l /usr/include/@PVER@ + fi + fi + if [ -d /usr/lib/@PVER@/config ] && [ ! -h /usr/lib/@PVER@/config ]; then + if rmdir /usr/lib/@PVER@/config 2> /dev/null; then + ln -sf config-@VER@mu /usr/lib/@PVER@/config + else + echo >&2 "WARNING: non-empty directory on upgrade: /usr/lib/@PVER@/config" + ls -l /usr/lib/@PVER@/config + fi + fi +fi + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/PVER-doc.doc-base.PVER-api.in +++ python3.4-3.4.1/debian/PVER-doc.doc-base.PVER-api.in @@ -0,0 +1,13 @@ +Document: @PVER@-api +Title: Python/C API Reference Manual (v@VER@) +Author: Guido van Rossum +Abstract: This manual documents the API used by C (or C++) programmers who + want to write extension modules or embed Python. It is a + companion to *Extending and Embedding the Python Interpreter*, + which describes the general principles of extension writing but + does not document the API functions in detail. +Section: Programming/Python + +Format: HTML +Index: /usr/share/doc/@PVER@/html/c-api/index.html +Files: /usr/share/doc/@PVER@/html/c-api/*.html --- python3.4-3.4.1.orig/debian/PVER-doc.doc-base.PVER-dist.in +++ python3.4-3.4.1/debian/PVER-doc.doc-base.PVER-dist.in @@ -0,0 +1,13 @@ +Document: @PVER@-dist +Title: Distributing Python Modules (v@VER@) +Author: Greg Ward +Abstract: This document describes the Python Distribution Utilities + (``Distutils'') from the module developer's point-of-view, describing + how to use the Distutils to make Python modules and extensions easily + available to a wider audience with very little overhead for + build/release/install mechanics. 
+Section: Programming/Python + +Format: HTML +Index: /usr/share/doc/@PVER@/html/distutils/index.html +Files: /usr/share/doc/@PVER@/html/distutils/*.html --- python3.4-3.4.1.orig/debian/PVER-doc.doc-base.PVER-ext.in +++ python3.4-3.4.1/debian/PVER-doc.doc-base.PVER-ext.in @@ -0,0 +1,16 @@ +Document: @PVER@-ext +Title: Extending and Embedding the Python Interpreter (v@VER@) +Author: Guido van Rossum +Abstract: This document describes how to write modules in C or C++ to extend + the Python interpreter with new modules. Those modules can define + new functions but also new object types and their methods. The + document also describes how to embed the Python interpreter in + another application, for use as an extension language. Finally, + it shows how to compile and link extension modules so that they + can be loaded dynamically (at run time) into the interpreter, if + the underlying operating system supports this feature. +Section: Programming/Python + +Format: HTML +Index: /usr/share/doc/@PVER@/html/extending/index.html +Files: /usr/share/doc/@PVER@/html/extending/*.html --- python3.4-3.4.1.orig/debian/PVER-doc.doc-base.PVER-inst.in +++ python3.4-3.4.1/debian/PVER-doc.doc-base.PVER-inst.in @@ -0,0 +1,12 @@ +Document: @PVER@-inst +Title: Installing Python Modules (v@VER@) +Author: Greg Ward +Abstract: This document describes the Python Distribution Utilities + (``Distutils'') from the end-user's point-of-view, describing how to + extend the capabilities of a standard Python installation by building + and installing third-party Python modules and extensions. +Section: Programming/Python + +Format: HTML +Index: /usr/share/doc/@PVER@/html/install/index.html +Files: /usr/share/doc/@PVER@/html/install/*.html --- python3.4-3.4.1.orig/debian/PVER-doc.doc-base.PVER-lib.in +++ python3.4-3.4.1/debian/PVER-doc.doc-base.PVER-lib.in @@ -0,0 +1,15 @@ +Document: @PVER@-lib +Title: Python Library Reference (v@VER@) +Author: Guido van Rossum +Abstract: This library reference manual documents Python's standard library, + as well as many optional library modules (which may or may not be + available, depending on whether the underlying platform supports + them and on the configuration choices made at compile time). It + also documents the standard types of the language and its built-in + functions and exceptions, many of which are not or incompletely + documented in the Reference Manual. +Section: Programming/Python + +Format: HTML +Index: /usr/share/doc/@PVER@/html/library/index.html +Files: /usr/share/doc/@PVER@/html/library/*.html --- python3.4-3.4.1.orig/debian/PVER-doc.doc-base.PVER-new.in +++ python3.4-3.4.1/debian/PVER-doc.doc-base.PVER-new.in @@ -0,0 +1,10 @@ +Document: @PVER@-new +Title: What's new in Python @VER@ +Author: A.M. Kuchling +Abstract: This documents lists new features and changes worth mentioning + in Python @VER@. +Section: Programming/Python + +Format: HTML +Index: /usr/share/doc/@PVER@/html/whatsnew/@VER@.html +Files: /usr/share/doc/@PVER@/html/whatsnew/@VER@.html --- python3.4-3.4.1.orig/debian/PVER-doc.doc-base.PVER-ref.in +++ python3.4-3.4.1/debian/PVER-doc.doc-base.PVER-ref.in @@ -0,0 +1,18 @@ +Document: @PVER@-ref +Title: Python Reference Manual (v@VER@) +Author: Guido van Rossum +Abstract: This reference manual describes the syntax and "core semantics" of + the language. It is terse, but attempts to be exact and complete. + The semantics of non-essential built-in object types and of the + built-in functions and modules are described in the *Python + Library Reference*. 
For an informal introduction to the language, + see the *Python Tutorial*. For C or C++ programmers, two + additional manuals exist: *Extending and Embedding the Python + Interpreter* describes the high-level picture of how to write a + Python extension module, and the *Python/C API Reference Manual* + describes the interfaces available to C/C++ programmers in detail. +Section: Programming/Python + +Format: HTML +Index: /usr/share/doc/@PVER@/html/reference/index.html +Files: /usr/share/doc/@PVER@/html/reference/*.html --- python3.4-3.4.1.orig/debian/PVER-doc.doc-base.PVER-tut.in +++ python3.4-3.4.1/debian/PVER-doc.doc-base.PVER-tut.in @@ -0,0 +1,13 @@ +Document: @PVER@-tut +Title: Python Tutorial (v@VER@) +Author: Guido van Rossum, Fred L. Drake, Jr., editor +Abstract: This tutorial introduces the reader informally to the basic + concepts and features of the Python language and system. It helps + to have a Python interpreter handy for hands-on experience, but + all examples are self-contained, so the tutorial can be read + off-line as well. +Section: Programming/Python + +Format: HTML +Index: /usr/share/doc/@PVER@/html/tutorial/index.html +Files: /usr/share/doc/@PVER@/html/tutorial/*.html --- python3.4-3.4.1.orig/debian/PVER-doc.overrides.in +++ python3.4-3.4.1/debian/PVER-doc.overrides.in @@ -0,0 +1,2 @@ +# this is referenced by the html docs +@PVER@-doc binary: extra-license-file --- python3.4-3.4.1.orig/debian/PVER-examples.overrides.in +++ python3.4-3.4.1/debian/PVER-examples.overrides.in @@ -0,0 +1,2 @@ +# don't care about permissions of the example files +@PVER@-examples binary: executable-not-elf-or-script --- python3.4-3.4.1.orig/debian/PVER-minimal.README.Debian.in +++ python3.4-3.4.1/debian/PVER-minimal.README.Debian.in @@ -0,0 +1,169 @@ +Contents of the @PVER@-minimal package +----------------------------------------- + +@PVER@-minimal consists of a minimum set of modules which may be needed +for python scripts used during the boot process. If other packages +are needed in these scripts, don't work around the missing module, but +file a bug report against this package. 
The modules in this package +are: + + __builtin__ builtin + __future__ module + _ast extension + _bisect extension + _bootlocale module + _bytesio builtin + _codecs builtin + _collections extension + _collections_abc module + _compat_pickle module + _datetime extension + _elementtree extension + _fileio builtin + _functools extension + _hashlib extensionx + _heapq extension + _imp builtin + _io builtin + _locale extension + _md5 extension + _opcode extension + _operator extension + _pickle extension + _posixsubprocess extension + _random extension + _sha1 extension + _sha3 extension + _sha256 extension + _sha512 extension + _sitebuiltins module + _socket extension + _sre builtin + _ssl extensionx + _stat extension + _stringio extension + _struct extension + _string builtin + _stringio builtin + _symtable builtin + _sysconfigdata module + _thread builtin + _threading_local module + _types builtin + _weakref builtin + _weakrefset module + _warnings builtin + configparser module + abc module + argparse module + array extension + ast module + atexit extension + base64 module + binascii extension + bisect module + builtins builtin + codecs module + collections package + compileall module + contextlib module + copy module + copyreg module + dis module + encodings package + enum module + errno builtin + exceptions builtin + fcntl extension + fnmatch module + functools module + gc builtin + genericpath module + getopt module + glob module + grp extension + hashlib module + heapq module + imp module + importlib package + inspect module + io module + itertools extension + keyword module + linecache module + locale module + logging package + marshal builtin + math extension + opcode module + operator module + optparse module + os module + pickle module + pkgutil module + platform module + posix builtin + posixpath module + pwd builtin + pyexpat extension + py_compile module + random module + re module + reprlib module + runpy module + select extension + selectors module + signal builtin + socket module + spwd extension + sre_compile module + sre_constants module + sre_parse module + ssl module + stat module + string module + struct module + subprocess module + sys builtin + sysconfig module + syslog extension + tempfile module + textwrap module + threading module + time extension + token module + tokenize module + traceback module + types module + unicodedata extension + warnings module + weakref module + zipimport extension + zlib extension + +Included are as well the codecs and stringprep modules, and the encodings +modules for all encodings except the multibyte encodings and the bz2 codec. + +The following modules are excluded, their import is guarded from the +importing module: + + Used in Excluded + ------------ ------------------------------------ + io _dummy_thread + os nt ntpath os2 os2emxpath mac macpath + riscos riscospath riscosenviron + optparse gettext + pickle doctest + subprocess threading_dummy + +This list was derived by looking at the modules in the perl-base package, +then adding python specific "core modules". + +TODO's +------ + +- time.strptime cannot be used. The required _strptime module is not + included in the -minimal package yet. _strptime, locale, _locale and + calendar have to be added. + +- modules used very often in the testsuite: copy, cPickle, operator. --- python3.4-3.4.1.orig/debian/PVER-minimal.postinst.in +++ python3.4-3.4.1/debian/PVER-minimal.postinst.in @@ -0,0 +1,82 @@ +#! /bin/sh + +set -e + +if [ ! 
-f /etc/@PVER@/sitecustomize.py ]; then + cat <<-EOF + # Empty sitecustomize.py to avoid a dangling symlink +EOF +fi + +case "$1" in + configure) + # Create empty directories in /usr/local + if [ ! -e /usr/local/lib/@PVER@ ]; then + mkdir -p /usr/local/lib/@PVER@ 2> /dev/null || true + chmod 2775 /usr/local/lib/@PVER@ 2> /dev/null || true + chown root:staff /usr/local/lib/@PVER@ 2> /dev/null || true + fi + localsite=/usr/local/lib/@PVER@/dist-packages + if [ ! -e $localsite ]; then + mkdir -p $localsite 2> /dev/null || true + chmod 2775 $localsite 2> /dev/null || true + chown root:staff $localsite 2> /dev/null || true + fi + + if which update-binfmts >/dev/null; then + update-binfmts --import @PVER@ + fi + + ;; +esac + +if [ "$1" = configure ]; then + + # only available before removal of the packaging package + rm -f /etc/@PVER@/sysconfig.cfg + + if ls -L /usr/lib/@PVER@/sitecustomize.py >/dev/null 2>&1; then + filt='cat' + else + filt='fgrep -v sitecustomize.py' + fi + files=$(dpkg -L lib@PVER@-minimal@HOST_QUAL@ \ + | sed -n '/^\/usr\/lib\/@PVER@\/.*\.py$/p' | $filt) + if [ -n "$files" ]; then + @PVER@ -E -S /usr/lib/@PVER@/py_compile.py $files + if grep -sq '^byte-compile[^#]*optimize' /etc/python/debian_config; then + @PVER@ -E -S -O /usr/lib/@PVER@/py_compile.py $files + fi + else + echo >&2 "@PVER@-minimal: can't get files for byte-compilation" + fi + bc=no + #if [ -z "$2" ] || dpkg --compare-versions "$2" lt 2.5-3 \ + # || [ -f /var/lib/python/@PVER@_installed ]; then + # bc=yes + #fi + if ! grep -sq '^supported-versions[^#]*@PVER@' /usr/share/python/debian_defaults + then + # FIXME: byte compile anyway? + bc=no + fi + if [ "$bc" = yes ]; then + # new installation or installation of first version with hook support + if [ "$DEBIAN_FRONTEND" != noninteractive ]; then + echo "Linking and byte-compiling packages for runtime @PVER@..." + fi + version=$(dpkg -s @PVER@-minimal | awk '/^Version:/ {print $2}') + for hook in /usr/share/python3/runtime.d/*.rtinstall; do + [ -x $hook ] || continue + $hook rtinstall @PVER@ "$2" "$version" + done + if [ -f /var/lib/python/@PVER@_installed ]; then + rm -f /var/lib/python/@PVER@_installed + rmdir --ignore-fail-on-non-empty /var/lib/python 2>/dev/null + fi + fi +fi + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/PVER-minimal.postrm.in +++ python3.4-3.4.1/debian/PVER-minimal.postrm.in @@ -0,0 +1,13 @@ +#! /bin/sh + +set -e + +if [ "$1" = "remove" ]; then + + if [ -f /var/lib/python/@PVER@_installed ]; then + rm -f /var/lib/python/@PVER@_installed + rmdir --ignore-fail-on-non-empty /var/lib/python 2>/dev/null + fi + + rmdir --parents /usr/local/lib/@PVER@ 2>/dev/null || true +fi --- python3.4-3.4.1.orig/debian/PVER-minimal.preinst.in +++ python3.4-3.4.1/debian/PVER-minimal.preinst.in @@ -0,0 +1,26 @@ +#!/bin/sh + +set -e + +case "$1" in + install) + # remember newly installed runtime + mkdir -p /var/lib/python + touch /var/lib/python/@PVER@_installed + ;; + upgrade) + : + ;; + + abort-upgrade) + ;; + + *) + echo "preinst called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/PVER-minimal.prerm.in +++ python3.4-3.4.1/debian/PVER-minimal.prerm.in @@ -0,0 +1,36 @@ +#! 
/bin/sh + +set -e + +case "$1" in + remove) + if [ "$DEBIAN_FRONTEND" != noninteractive ]; then + echo "Unlinking and removing bytecode for runtime @PVER@" + fi + for hook in /usr/share/python3/runtime.d/*.rtremove; do + [ -x $hook ] || continue + $hook rtremove @PVER@ || continue + done + + if which update-binfmts >/dev/null; then + update-binfmts --package @PVER@ --remove @PVER@ /usr/bin/@PVER@ + fi + + localsite=/usr/local/lib/@PVER@/dist-packages + [ -d $localsite ] && rmdir $localsite 2>/dev/null || true + [ -d $(dirname $localsite) ] \ + && rmdir $(dirname $localsite) 2>/dev/null || true + ;; + upgrade) + ;; + deconfigure) + ;; + failed-upgrade) + ;; + *) + echo "prerm called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# --- python3.4-3.4.1.orig/debian/PVER-venv.postinst.in +++ python3.4-3.4.1/debian/PVER-venv.postinst.in @@ -0,0 +1,20 @@ +#! /bin/sh + +set -e + +case "$1" in + configure) + files=$(dpkg -L @PVER@-venv | sed -n '/^\/usr\/lib\/@PVER@\/.*\.py$/p') + if [ -n "$files" ]; then + @PVER@ -E -S /usr/lib/@PVER@/py_compile.py $files + if grep -sq '^byte-compile[^#]*optimize' /etc/python/debian_config; then + @PVER@ -E -S -O /usr/lib/@PVER@/py_compile.py $files + fi + else + echo >&2 "@PVER@: can't get files for byte-compilation" + fi +esac + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/PVER-venv.prerm.in +++ python3.4-3.4.1/debian/PVER-venv.prerm.in @@ -0,0 +1,35 @@ +#! /bin/sh + +set -e + +remove_bytecode() +{ + pkg=$1 + max=$(LANG=C LC_ALL=C xargs --show-limits < /dev/null 2>&1 | awk '/Maximum/ {print int($NF / 4)}') + dpkg -L $pkg \ + | awk -F/ 'BEGIN {OFS="/"} /\.py$/ {$NF=sprintf("__pycache__/%s.*.py[co]", substr($NF,1,length($NF)-3)); print}' \ + | xargs --max-chars="$max" echo \ + | while read files; do rm -f $files; done + + find /usr/lib/@PVER@/ensurepip \ + -name __pycache__ -type d -empty -print \ + | xargs -r rm -rf +} + +case "$1" in + remove) + remove_bytecode @PVER@-venv + ;; + upgrade) + ;; + deconfigure) + ;; + failed-upgrade) + ;; + *) + echo "prerm called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# --- python3.4-3.4.1.orig/debian/PVER.desktop.in +++ python3.4-3.4.1/debian/PVER.desktop.in @@ -0,0 +1,10 @@ +[Desktop Entry] +Name=Python (v@VER@) +Comment=Python Interpreter (v@VER@) +Exec=/usr/bin/@PVER@ +Icon=/usr/share/pixmaps/@PVER@.xpm +Terminal=true +Type=Application +Categories=Development; +StartupNotify=true +NoDisplay=true --- python3.4-3.4.1.orig/debian/PVER.menu.in +++ python3.4-3.4.1/debian/PVER.menu.in @@ -0,0 +1,4 @@ +?package(@PVER@):needs="text" section="Applications/Programming"\ + title="Python (v@VER@)"\ + icon="/usr/share/pixmaps/@PVER@.xpm"\ + command="/usr/bin/python@VER@" --- python3.4-3.4.1.orig/debian/PVER.overrides.in +++ python3.4-3.4.1/debian/PVER.overrides.in @@ -0,0 +1,8 @@ +# yes, we have to +@PVER@ binary: depends-on-python-minimal + +@PVER@ binary: desktop-command-not-in-package +@PVER@ binary: menu-command-not-in-package + +# no, not useless +@PVER@ binary: manpage-has-useless-whatis-entry --- python3.4-3.4.1.orig/debian/PVER.postinst.in +++ python3.4-3.4.1/debian/PVER.postinst.in @@ -0,0 +1,20 @@ +#! 
/bin/sh + +set -e + +case "$1" in + configure) + files=$(dpkg -L lib@PVER@-stdlib@HOST_QUAL@ | sed -n '/^\/usr\/lib\/@PVER@\/.*\.py$/p') + if [ -n "$files" ]; then + @PVER@ -E -S /usr/lib/@PVER@/py_compile.py $files + if grep -sq '^byte-compile[^#]*optimize' /etc/python/debian_config; then + @PVER@ -E -S -O /usr/lib/@PVER@/py_compile.py $files + fi + else + echo >&2 "@PVER@: can't get files for byte-compilation" + fi +esac + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/PVER.prerm.in +++ python3.4-3.4.1/debian/PVER.prerm.in @@ -0,0 +1,31 @@ +#! /bin/sh + +set -e + +remove_bytecode() +{ + pkg=$1 + max=$(LANG=C LC_ALL=C xargs --show-limits < /dev/null 2>&1 | awk '/Maximum/ {print int($NF / 4)}') + dpkg -L $pkg \ + | awk -F/ 'BEGIN {OFS="/"} /\.py$/ {$NF=sprintf("__pycache__/%s.*.py[co]", substr($NF,1,length($NF)-3)); print}' \ + | xargs --max-chars="$max" echo \ + | while read files; do rm -f $files; done + find /usr/lib/python3 /usr/lib/@PVER@ -name dist-packages -prune -o -name __pycache__ -empty -print \ + | xargs -r rm -rf +} + +case "$1" in + remove|upgrade) + remove_bytecode lib@PVER@-stdlib@HOST_QUAL@ + ;; + deconfigure) + ;; + failed-upgrade) + ;; + *) + echo "prerm called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# --- python3.4-3.4.1.orig/debian/README.Debian.in +++ python3.4-3.4.1/debian/README.Debian.in @@ -0,0 +1,8 @@ +The documentation for this package is in /usr/share/doc/@PVER@/. + +A draft of the "Debian Python Policy" can be found in + + /usr/share/doc/python + +Sometime it will be moved to /usr/share/doc/debian-policy in the +debian-policy package. --- python3.4-3.4.1.orig/debian/README.PVER.in +++ python3.4-3.4.1/debian/README.PVER.in @@ -0,0 +1,95 @@ + + Python @VER@ for Debian + --------------------- + +This is Python @VER@ packaged for Debian. + +This document contains information specific to the Debian packages of +Python @VER@. + + + + [TODO: This document is not yet up-to-date with the packages.] + +Currently, it features those two main topics: + + 1. Release notes for the Debian packages: + 2. Notes for developers using the Debian Python packages: + +Release notes and documentation from the upstream package are installed +in /usr/share/doc/@PVER@/. + +There's a mailing list for discussion of issues related to Python on Debian +systems: debian-python@lists.debian.org. The list is not intended for +general Python problems, but as a forum for maintainers of Python-related +packages and interested third parties. + + + +1. Release notes for the Debian packages: + + +Results of the regression test: +------------------------------ + +The package does successfully run the regression tests for all included +modules. Seven packages are skipped since they are platform-dependent and +can't be used with Linux. + + +2. Notes for developers using the Debian python packages: + +See the draft of the Debian Python policy in /usr/share/doc/python. + +distutils can be found in the @PVER@-dev package. Development files +like the python library or Makefiles can be found in the @PVER@-dev +package in /usr/lib/@PVER@/config. Therefore, if you need to install +a pure python extension, you only need @PVER@. On the other hand, to +install a C extension, you need @PVER@-dev. 
+ +a) Locally installed Python add-ons + + /usr/local/lib/@PVER@/site-packages/ + /usr/local/lib/site-python/ (version-independent modules) + +b) Python add-ons packaged for Debian + + /usr/lib/@PVER@/site-packages/ + /usr/lib/site-python/ (version-independent modules) + +Note that no package must install files directly into /usr/lib/@PVER@/ +or /usr/local/lib/@PVER@/. Only the site-packages directory is allowed +for third-party extensions. + +Use of the new `package' scheme is strongly encouraged. The `ni' interface +is obsolete in python 1.5. + +Header files for extensions go into /usr/include/@PVER@/. + + +Installing extensions for local use only: +---------------------------------------- + +Consider using distutils ... + +Most extensions use Python's Makefile.pre.in. Note that Makefile.pre.in +by default will install files into /usr/lib/, not into /usr/local/lib/, +which is not allowed for local extensions. You'll have to change the +Makefile accordingly. Most times, "make prefix=/usr/local install" will +work. + + +Packaging python extensions for Debian: +-------------------------------------- + +Maintainers of Python extension packages should read + + /usr/share/doc/python/python-policy.txt.gz + + + + + 03/09/98 + Gregor Hoffleit + +Last change: 2001-12-14 --- python3.4-3.4.1.orig/debian/README.Tk +++ python3.4-3.4.1/debian/README.Tk @@ -0,0 +1,8 @@ +Tkinter documentation can be found at + + http://www.pythonware.com/library/index.htm + +more specific: + + http://www.pythonware.com/library/tkinter/introduction/index.htm + http://www.pythonware.com/library/tkinter/an-introduction-to-tkinter.pdf --- python3.4-3.4.1.orig/debian/README.dbm +++ python3.4-3.4.1/debian/README.dbm @@ -0,0 +1,72 @@ + + Python and dbm modules on Debian + -------------------------------- + +This file documents the configuration of the dbm modules for Debian. It +gives hints at the preferred use of the dbm modules. + + +The preferred way to access dbm databases in Python is the anydbm module. +dbm databases behave like mappings (dictionaries). + +Since there exist several dbm database formats, we choose the following +layout for Python on Debian: + + * creating a new database with anydbm will create a Berkeley DB 2.X Hash + database file. This is the standard format used by libdb starting + with glibc 2.1. + + * opening an existing database with anydbm will try to guess the format + of the file (using whichdb) and then load it using one of the bsddb, + bsddb1, gdbm or dbm (only if the python-gdbm package is installed) + or dumbdbm modules. + + * The modules use the following database formats: + + - bsddb: Berkeley DB 2.X Hash (as in libc6 >=2.1 or libdb2) + - bsddb1: Berkeley DB 1.85 Hash (as in libc6 >=2.1 or libdb2) + - gdbm: GNU dbm 1.x or ndbm + - dbm: " (nearly the same as the gdbm module for us) + - dumbdbm: a hand-crafted format only used in this module + + That means that all usual formats should be readable with anydbm. + + * If you want to create a database in a format different from DB 2.X, + you can still directly use the specified module. + + * I.e. bsddb is the preferred module, and DB 2.X is the preferred format. + + * Note that the db1hash and bsddb1 modules are Debian specific. anydbm + and whichdb have been modified to support DB 2.X Hash files (see + below for details). 
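+
+As a small illustration of the anydbm interface described above (Python 2 era module names; "example.db" is only a placeholder):
+
+    import anydbm
+
+    # "c" opens the database for read/write and creates the file if it
+    # does not exist; on Debian a new file is created in Berkeley DB 2.X
+    # Hash format, as explained above.
+    db = anydbm.open("example.db", "c")
+    db["key"] = "value"        # dbm databases behave like dictionaries
+    db.close()
+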
+ + + +For experts only: +---------------- + +Although bsddb employs the new DB 2.X format and uses the new Sleepycat +DB 2 library as included with glibc >= 2.1, it's still using the old +DB 1.85 API (which is still supported by DB 2). + +A more recent version 1.1 of the BSD DB module (available from +http://starship.skyport.net/robind/python/) directly uses the DB 2.X API. +It has a richer set of features. + + +On a glibc 2.1 system, bsddb is linked with -ldb, bsddb1 is linked with +-ldb1 and gdbm as well as dbm are linked with -lgdbm. + +On a glibc 2.0 system (e.g. potato for m68k or slink), bsddb will be +linked with -ldb2 while bsddb1 will be linked with -ldb (therefore +python-base here depends on libdb2). + + +db1hash and bsddb1 nearly completely identical to dbhash and bsddb. The +only difference is that bsddb is linked with the real DB 2 library, while +bsddb1 is linked with an library which provides compatibility with legacy +DB 1.85 databases. + + + July 16, 1999 + Gregor Hoffleit --- python3.4-3.4.1.orig/debian/README.idle-PVER.in +++ python3.4-3.4.1/debian/README.idle-PVER.in @@ -0,0 +1,14 @@ + + The Python IDLE package for Debian + ---------------------------------- + +This package contains Python @VER@'s Integrated DeveLopment Environment, IDLE. + +IDLE is included in the Python @VER@ upstream distribution (Tools/idle) and +depends on Tkinter (available as @PVER@-tk package). + +I have written a simple man page. + + + 06/16/1999 + Gregor Hoffleit --- python3.4-3.4.1.orig/debian/README.maintainers.in +++ python3.4-3.4.1/debian/README.maintainers.in @@ -0,0 +1,88 @@ + +Hints for maintainers of Debian packages of Python extensions +------------------------------------------------------------- + +Most of the content of this README can be found in the Debian Python policy. +See /usr/share/doc/python/python-policy.txt.gz. + +Documentation Tools +------------------- + +If your package ships documentation produced in the Python +documentation format, you can generate it at build-time by +build-depending on @PVER@-dev, and you will find the +templates, tools and scripts in /usr/lib/@PVER@/doc/tools -- +adjust your build scripts accordingly. + + +Makefile.pre.in issues +---------------------- + +Python comes with a `universal Unix Makefile for Python extensions' in +/usr/lib/@PVER@/config/Makefile.pre.in (with Debian, this is included +in the python-dev package), which is used by most Python extensions. + +In general, packages using the Makefile.pre.in approach can be packaged +simply by running dh_make or by using one of debhelper's rules' templates +(see /usr/doc/debhelper/examples/). Makefile.pre.in works fine with e.g. +"make prefix=debian/tmp/usr install". + +One glitch: You may be running into the problem that Makefile.pre.in +doesn't try to create all the directories when they don't exist. Therefore, +you may have to create them manually before "make install". In most cases, +the following should work: + + ... + dh_installdirs /usr/lib/@PVER@ + $(MAKE) prefix=debian/tmp/usr install + ... + + +Byte-compilation +---------------- + +For speed reasons, Python internally compiles source files into a byte-code. +To speed up subsequent imports, it tries to save the byte-code along with +the source with an extension .pyc (resp. pyo). This will fail if the +libraries are installed in a non-writable directory, which may be the +case for /usr/lib/@PVER@/. 
+ +Not that .pyc and .pyo files should not be relocated, since for debugging +purposes the path of the source for is hard-coded into them. + +To precompile files in batches after installation, Python has a script +compileall.py, which compiles all files in a given directory tree. The +Debian version of compileall has been enhanced to support incremental +compilation and to feature a ddir (destination dir) option. ddir is +used to compile files in debian/usr/lib/python/ when they will be +installed into /usr/lib/python/. + + +Currently, there are two ways to use compileall for Debian packages. The +first has a speed penalty, the second has a space penalty in the package. + +1.) Compiling and removing .pyc files in postinst/prerm: + + Use dh_python(1) from the debhelper packages to add commands to byte- + compile on installation and to remove the byte-compiled files on removal. + Your package has to build-depend on: debhelper (>= 4.1.67), python. + + In /usr/share/doc/@PVER@, you'll find sample.postinst and sample.prerm. + If you set the directory where the .py files are installed, these + scripts will install and remove the .pyc and .pyo files for your + package after unpacking resp. before removing the package. + +2.) Compiling the .pyc files `out of place' during installation: + + As of 1.5.1, compileall.py allows you to specify a faked installation + directory using the "-d destdir" option, so that you can precompile + the files in their temporary directory + (e.g. debian/tmp/usr/lib/python2.1/site-packages/PACKAGE). + + + + 11/02/98 + Gregor Hoffleit + + +Last modified: 2007-10-14 --- python3.4-3.4.1.orig/debian/README.python +++ python3.4-3.4.1/debian/README.python @@ -0,0 +1,153 @@ + + Python 2.x for Debian + --------------------- + +This is Python 2.x packaged for Debian. + +This document contains information specific to the Debian packages of +Python 2.x. + + + + [TODO: This document is not yet up-to-date with the packages.] + + + + + + +Currently, it features those two main topics: + + 1. Release notes for the Debian packages: + 2. Notes for developers using the Debian Python packages: + +Release notes and documentation from the upstream package are installed +in /usr/share/doc/python/. + +Up-to-date information regarding Python on Debian systems is also +available as http://www.debian.org/~flight/python/. + +There's a mailing list for discussion of issues related to Python on Debian +systems: debian-python@lists.debian.org. The list is not intended for +general Python problems, but as a forum for maintainers of Python-related +packages and interested third parties. + + + +1. Release notes for the Debian packages: + + +Results of the regression test: +------------------------------ + +The package does successfully run the regression tests for all included +modules. Seven packages are skipped since they are platform-dependent and +can't be used with Linux. + + +Noteworthy changes since the 1.4 packages: +----------------------------------------- + +- Threading support enabled. +- Tkinter for Tcl/Tk 8.x. +- New package python-zlib. +- The dbmmodule was dropped. Use bsddb instead. gdbmmodule is provided + for compatibility's sake. +- python-elisp adheres to the new emacs add-on policy; it now depends + on emacsen. python-elisp probably won't work correctly with emacs19. + Refer to /usr/doc/python-elisp/ for more information. +- Remember that 1.5 has dropped the `ni' interface in favor of a generic + `packages' concept. 
+- Python 1.5 regression test as additional package python-regrtest. You + don't need to install this package unless you don't trust the + maintainer ;-). +- once again, modified upstream's compileall.py and py_compile.py. + Now they support compilation of optimized byte-code (.pyo) for use + with "python -O", removal of .pyc and .pyo files where the .py source + files are missing (-d) and finally the fake of a installation directory + when .py files have to be compiled out of place for later installation + in a different directory (-i destdir, used in ./debian/rules). +- The Debian packages for python 1.4 do call + /usr/lib/python1.4/compileall.py in their postrm script. Therefore + I had to provide a link from /usr/lib/python1.5/compileall.py, otherwise + the old packages won't be removed completely. THIS IS A SILLY HACK! + + + +2. Notes for developers using the Debian python packages: + + +Embedding python: +---------------- + +The files for embedding python resp. extending the python interpreter +are included in the python-dev package. With the configuration in the +Debian GNU/Linux packages of python 1.5, you will want to use something +like + + -I/usr/include/python1.5 (e.g. for config.h) + -L/usr/lib/python1.5/config -lpython1.5 (... -lpthread) + (also for Makefile.pre.in, Setup etc.) + +Makefile.pre.in automatically gets that right. Note that unlike 1.4, +python 1.5 has only one library, libpython1.5.a. + +Currently, there's no shared version of libpython. Future version of +the Debian python packages will support this. + + +Python extension packages: +------------------------- + +According to www.python.org/doc/essays/packages.html, extension packages +should only install into /usr/lib/python1.5/site-packages/ (resp. +/usr/lib/site-python/ for packages that are definitely version independent). +No extension package should install files directly into /usr/lib/python1.5/. + +But according to the FSSTND, only Debian packages are allowed to use +/usr/lib/python1.5/. Therefore Debian Python additionally by default +searches a second hierarchy in /usr/local/lib/. These directories take +precedence over their equivalents in /usr/lib/. + +a) Locally installed Python add-ons + + /usr/local/lib/python1.5/site-packages/ + /usr/local/lib/site-python/ (version-independent modules) + +b) Python add-ons packaged for Debian + + /usr/lib/python1.5/site-packages/ + /usr/lib/site-python/ (version-independent modules) + +Note that no package must install files directly into /usr/lib/python1.5/ +or /usr/local/lib/python1.5/. Only the site-packages directory is allowed +for third-party extensions. + +Use of the new `package' scheme is strongly encouraged. The `ni' interface +is obsolete in python 1.5. + +Header files for extensions go into /usr/include/python1.5/. + + +Installing extensions for local use only: +---------------------------------------- + +Most extensions use Python's Makefile.pre.in. Note that Makefile.pre.in +by default will install files into /usr/lib/, not into /usr/local/lib/, +which is not allowed for local extensions. You'll have to change the +Makefile accordingly. Most times, "make prefix=/usr/local install" will +work. + + +Packaging python extensions for Debian: +-------------------------------------- + +Maintainers of Python extension packages should read README.maintainers. 
+ + + + + 03/09/98 + Gregor Hoffleit + +Last change: 07/16/1999 --- python3.4-3.4.1.orig/debian/README.source +++ python3.4-3.4.1/debian/README.source @@ -0,0 +1,7 @@ +The source tarball is lacking the files Lib/profile.py and Lib/pstats.py, +which Debian considers to have a license non-suitable for main (the use +of these modules limited to python). + +The package uses quilt to apply / unapply patches. +See /usr/share/doc/quilt/README.source. The series file is generated +during the build. --- python3.4-3.4.1.orig/debian/README.venv +++ python3.4-3.4.1/debian/README.venv @@ -0,0 +1,230 @@ +========================================= + pyvenv support in Python 3.4 and beyond +========================================= + +In Python 3.3, built-in support for virtual environments (venvs) was added via +the `pyvenv`_ command. For building venvs using Python 3, this is +functionally equivalent to the standalone `virtualenv`_ tool, except that +before Python 3.4, the pyvenv created venv didn't include pip and setuptools. + +In Python 3.4, this was made even more convenient by the `automatic +inclusion`_ of the `pip`_ command into the venv so that third party libraries +can be easily installed from the Python Package Index (PyPI_). The stdlib +module `ensurepip`_ is run when the `pyvenv-3.4` command is run to create the +venv. + +This poses a problem for Debian. ensurepip comes bundled with two third party +libraries, setuptools and pip itself, as these are requirements for pip to +function properly in the venv. These are bundled in the ensurepip module of +the upstream Python 3.4 tarball as `universal wheels`_, essentially a zip of +the source code and a new ``dist-info`` metadata directory. Upstream pip +itself comes bundled with a half dozen or so of *its* dependencies, except +that these are "vendorized", meaning their unpacked source code lives within +the pip module, under a submodule from which pip imports them rather than the +top-level package namespace. + +To make matters worse, one of pip's vendorized dependencies, the `requests`_ +module, *also* vendorizes a bunch of its own dependencies. This stack of +vendorized and bundled third party libraries fundamentally violates the DFSG +and Debian policy against including code not built from source available +within Debian, and for including embedded "convenience" copies of code in +other packages. + +It's worth noting that the virtualenv package actually suffers from the same +conflict, but its current solution in Debian is `incomplete`_. + + +Solving the conflict +==================== + +This conflict between Debian policy and upstream Python convenience must be +resolved, because pyvenv is the recommended way of creating venvs in Python 3, +and because at some point, the standalone virtualenv tool will be rewritten as +a thin layer above pyvenv. Obviously, we want to provide the best Python +virtual environment experience to our developers, adherent to Debian policy. + +The approach we've taken is layered and nuanced, so I'll provide a significant +amount of detail to explain both what we do and why. + +The first thing to notice is how upstream ensurepip works to have its pip and +setuptools dependencies available, both at venv creation time and when +``/bin/pip`` is run. 
When pyvenv-3.4 runs, it ends up calling the +following Python command:: + + /bin/python -Im ensurepip --upgrade + +This runs the ensurepip's ``__main__.py`` module using the venv's Python in +isolation mode, with a switch to upgrade the setuptools and pip dependencies +(if for example, they've been updated in a new micro version of Python). + +Internally, ensurepip bootstraps itself by byte-copying its embedded wheels +into a temporary directory, putting those copied wheels on ``sys.path``, and +then calling into pip as a library. Because wheels are just elaborate zips, +Python can execute (pure-Python) code directly from them, if they are on +``sys.path`` of course. Once ensurepip has set up its execution environment, +it calls into pip to install both pip and setuptools into the newly created +venv. If you poke inside the venv after successful creation, you'll see +unpacked pip and setuptools directories in the venv's ``site-packages` +directory. + +The important thing to note here is that ensurepip is *already* able to import +from and install wheels, and because wheels are self-contained single files +(of zipped content), it makes manipulating them quite easy. In order to +minimize the delta from upstream (and to eventually work with upstream to +eliminate this delta), it seems optimal that Debian's solution should also be +based on wheels, and re-use as much of the existing machinery as possible. + +The difference for Debian though is that we don't want to use the embedded pip +and setuptools wheels from upstream Python's ensurepip; we want to use wheels +created from the pip and setuptools *Debian* packages. This would solve the +problem of distributing binary packages not built from source in Debian. + +Thus, we modify the python-pip and python-setuptools packages to include new +binary packages ``python-pip-whl`` and ``python-setuptools-whl` which contain +*only* the relevant universal wheels. Those packages ``debian/rules`` files +gain an extra command:: + + python3 setup.py bdist_wheel --universal -d + +The ``bdist_wheel`` command is provided by the `wheel`_ package, which as of +this writing is newly available in Jessie. + +Note that the name of the binary packages, and other details of when and how +wheels may be used in Debian, is described in `Debian Python Policy`_ 0.9.6 or +newer. + +The universal wheels (i.e. pure-Python code compatible with both Python 2 and +Python 3) are built for pip and setuptools and installed into +``/usr/share/python-wheels`` when the python-{pip,setuptols}-whl packages are +installed. These are not needed for normal, usual, and typical operation of +Python, so none of these are installed by default. + +However, this isn't enough, because since the pip and setuptools wheels are +built from the *patched* and de-vendorized versions of the code in Debian, the +wheels will not contain their own recursive dependencies. That's a good thing +for Debian policy compliance, but does add complications to the stack of hack. + +Using the same approach as for pip and setuptools, we *also* wheelify their +dependencies, recursively. As of this writing, the list of packages needing +to be wheelified are (by Debian source package name): + + * chardet + * distlib + * html5lib + * python-colorama + * python-pip + * python-setuptools + * python-urllib3 + * requests + * six + +Most of these are DPMT maintained. six, distlib, and colorama are not team +maintained, so coordination with those maintainers is required. 
Also note +that the `bdist_wheel` command is a setuptools extension, so since some of +those projects use ``distutils.core.setup()`` by default, they must be patched +to use ``setuptools.setup()`` instead. This isn't a problem because there's +no functional difference relevant to those packages; they likely use +distutils.core to avoid a third party dependency on setuptools. + +Each of these Debian source packages grow an additional binary package, just +like pip and setuptools, e.g. python-chardet-whl which contains the universal +wheel for that package built from patched Debian source. As above, when +installed, these binary packages drop their .whl files into the +``/usr/share/python-wheels`` directory. + +Now comes the fun part. + +In the python3.4 source package, we add a new binary package called +python3.4-venv. This will only contain the ``/usr/bin/pyvenv-3.4`` +executable, and its associated manpage. It also includes all the run-time +dependencies to make pyvenv work *including the wheel packages described +above*. + +(Technically speaking, you should substitute "Python 3.4 or later" for all +these discussions, and e.g. pyvenv-3.x for all versions subsequent to 3.4.) + +Python's ensurepip module has been modified in the following ways (see +``debian/patches/ensurepip.diff``): + + * When ensurepip is run outside of a venv as root, it raises an exception. + This use case is only to be supported by the separate python{,3}-pip + packages. + + * When ensurepip is run inside of a venv, it copies all dependent wheels from + ``/usr/share/python-wheels``. This includes the direct dependencies pip + and setuptools, as well as the recursive dependencies listed above. The + rest of the ensurepip machinery is unchanged: the wheels are still copied + into a temporary directory and placed on ``sys.path``, however only the + direct dependencies (i.e. pip and setuptools) are *installed* into the + venv's ``site-packages`` directory. The indirect dependencies are copied + to ``/lib/python-wheels`` since they'll be needed by the venv's pip + executable. + +Why do we do this latter rather than also installing the recursive +dependencies into the venv's ``site-packages``? It's because pip requires a +very specific set of dependencies and we don't want pip to break when the user +upgrades or downgrades one of those packages, which is perfectly valid in a +venv. It's exactly the same reason why pip vendorizes those libraries in the +first place; it's just that we're doing it in a more principled way (from the +point of view of the Debian distribution). + +The final piece of the puzzle is that Debian's pip will, when run inside of a +venv, introspect ``/lib/python-wheels`` and put every .whl file it sees +there *at the front of its sys.path*. Again, this is so that when pip runs, +it will find the versions of packages known to be good first, rather than any +other versions in the venv's ``site-packages``. + +As an example of the bad things that can happen if you don't do this, try +installing nose2_ into the venv, followed by genshi_. nose2 has a hard +requirement on a version of six that is older than the one used by pip +(indirectly). This older version of six is compatible with genshi, but *not* +with pip, so once nose2 is installed, if pip didn't load its version of six +from the private wheel, the installation attempt of genshi would traceback. +As it is, with the wheels early enough on ``sys.path``, pip itself works just +fine so that both nose2 and genshi can live together in the venv. 
+ + +Updating packages +================= + +Inevitably, new versions of Python or the pyvenv dependent packages will +appear. Unfortunately, as currently implemented (by both upstream ensurepip +and in our ensurepip patch), the versions of both the direct and indirect +dependencies are hardcoded in ``Lib/ensurepip/__init__.py``. When a Debian +developer updates any of the dependent packages, you will need to: + + * *Test that the new version is compatible with ensurepip*. + + * Update the version numbers in the ``debian/control`` file, for the + python3.x-venv binary package. + + * ``quilt push`` to the ensurepip patch, and update the version number in + ``Lib/ensurepip/__init__.py`` + +Then rebuild and upload python3.4. + +Yes, this isn't ideal, and I am working with upstream to find a good solution +that we can share. + + +Author +====== + +Barry A. Warsaw +2014-05-15 + + + +.. _pyvenv: http://legacy.python.org/dev/peps/pep-0405/ +.. _virtualenv: https://pypi.python.org/pypi/virtualenv +.. _`automatic inclusion`: http://legacy.python.org/dev/peps/pep-0453/ +.. _pip: https://pypi.python.org/pypi/pip +.. _PyPI: https://pypi.python.org/pypi +.. _ensurepip: https://docs.python.org/3/library/ensurepip.html +.. _`universal wheels`: http://legacy.python.org/dev/peps/pep-0427/ +.. _requests: https://pypi.python.org/pypi/requests +.. _incomplete: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=719767 +.. _wheel: https://pypi.python.org/pypi/wheel +.. _nose2: https://pypi.python.org/pypi/nose2 +.. _genshi: https://pypi.python.org/pypi/Genshi +.. _`Debian Python Policy`: https://www.debian.org/doc/packaging-manuals/python-policy/ --- python3.4-3.4.1.orig/debian/_sysconfigdata.py +++ python3.4-3.4.1/debian/_sysconfigdata.py @@ -0,0 +1,6 @@ +import sys + +if hasattr(sys, 'gettotalrefcount'): + from _sysconfigdata_dm import * +else: + from _sysconfigdata_m import * --- python3.4-3.4.1.orig/debian/changelog +++ python3.4-3.4.1/debian/changelog @@ -0,0 +1,2994 @@ +python3.4 (3.4.1-8) unstable; urgency=medium + + * Update to 20140726 from the 3.4 branch. + * Move turtledemo from libpython3.4-testsuite to python3.4-examples. + * Call dpkg -L in the maintainer scripts with an architecture qualifier + for M-A: same packages. Closes: #754914. + + -- Matthias Klose Sat, 26 Jul 2014 14:16:56 +0200 + +python3.4 (3.4.1-7) unstable; urgency=medium + + * Update to 20140706 from the 3.4 branch. + + -- Matthias Klose Sun, 06 Jul 2014 21:37:36 +0200 + +python3.4 (3.4.1-6) unstable; urgency=medium + + * Fix logic to disable running the pystone benchmark on KFreeBSD (Steven + Chamberlain). + + -- Matthias Klose Mon, 09 Jun 2014 12:06:18 +0200 + +python3.4 (3.4.1-5) unstable; urgency=medium + + * Update to 20140608 from the 3.4 branch. + * Disable running the pystone benchmark on KFreeBSD. + + -- Matthias Klose Sun, 08 Jun 2014 11:43:44 +0200 + +python3.4 (3.4.1-4) unstable; urgency=medium + + * Disable running the pybench benchmark on KFreeBSD. + + -- Matthias Klose Sat, 07 Jun 2014 14:01:35 +0200 + +python3.4 (3.4.1-3) unstable; urgency=medium + + * Update to 20140605 from the 3.4 branch. + - pull in pyvenv changes. + * Update the ensurepip-wheels patch (Barry Warsaw). + * Fix python3.4-venv package removal. + + -- Matthias Klose Thu, 05 Jun 2014 11:57:51 +0200 + +python3.4 (3.4.1-2) unstable; urgency=medium + + * Update to 20140603 from the 3.4 branch. + * Remove the __pycache__ directories on libpython3.4-testsuite package + removal. Closes: #749999. 
+ * In the autopkg tests, set HOME to the temporary home directory after + the su call. + * In the autopkg tests, make $ADTTMP accessible to the su user, and + re-enable the test_site autopkg test. + * Don't try to access the pip module in ensurepip, when the wheels + are not available. + + -- Matthias Klose Tue, 03 Jun 2014 23:58:48 +0200 + +python3.4 (3.4.1-1) unstable; urgency=medium + + * Python 3.4.1 release. + * Set a temporary home directory for the build and the autopkg tests. + * Fix issue #17752, test_distutils failures in the installed location. + * Update pydoc_data/topics.py, broken in the release candidate. + * Run again the test_code_module test in the autopkg tests. + * Fix issue #21264, test_compileall test failures in the installed + location. Re-enable in autopkg tests. LP: #1264554. + * ensurepip and pyvenv: + - Split out a python3.4-venv package, include the pyvenv-3.4 binary + and the ensurepip package. + - Adjust the ensurepip patch so that the wheels are installed from + the universal wheel packages (Barry Warsaw). + - Let ensurepip read wheel dependencies from a file shipped in the + -whl packages. + - Remove any version check on required pip and setuptools versions. + These are handled within these packages if necessary. + * Re-enable the pgo build. + + -- Matthias Klose Wed, 21 May 2014 22:17:32 +0200 + +python3.4 (3.4.1~rc1-1) unstable; urgency=medium + + * Python 3.4.1 release candidate 1. + * Don't run test_code_module in the autopkg test environment, fails there + but succeeds during the build. See issue #17756. Applied workaround + for the test case. + + -- Matthias Klose Mon, 05 May 2014 16:10:23 +0200 + +python3.4 (3.4.0+20140427-1) unstable; urgency=medium + + * Update to 20140427 from the 3.4 branch. + * Fix dependency for the -testsuite package: Closes: #745879. + + -- Matthias Klose Sun, 27 Apr 2014 18:48:54 +0200 + +python3.4 (3.4.0+20140425-1) unstable; urgency=medium + + * Update to 20140425 from the 3.4 branch. + * Don't try to byte-compile bad syntax files in the testsuite. + + -- Matthias Klose Fri, 25 Apr 2014 13:52:11 +0200 + +python3.4 (3.4.0+20140417-1) unstable; urgency=medium + + * Update to 20140417 from the 3.4 branch. + - Fix the test_site test failure. + * Repackage as a new tarball and remove the wheels shipped with ensurepip. + * Re-enable running some tests, disable some tests: + - Re-enable test_platform, test_subprocess, test_code_module, test_pydoc, + - Fix a distutils test error, skip a Solaris distutils test error. + - Skip the test_platform encoding test, failing with the lsb-release patch. + - Skip tests which are failing with python3.4.zip removed from sys.path. + Tracked in issue #21249. + * Byte-compile the files in the libpython3.4-testsuite package. + * d/p/distutils-install-layout.diff, d/p/site-locations.diff: Adjust the + "am I in a virtual environment" tests to include checking + sys.base_prefix != sys.prefix. This is the definitive such test for + pyvenv created virtual environments (Barry Warsaw). + * Disallow running ensurepip with the system python, when not used in + a virtual environment (Barry Warsaw). + * Don't yet install the ensurepip module, requires further work. + ensurepip wants to install bundled modules setuptools and python-pip, + which should be built from the distro packages instead of using the + bundled code. + * python3.4-dbg: Add a python3.4-dbg.py symlink. + * Remove the linecache patch, not needed anymore in 3.4. + * Remove the disable-utimes patch, not needed anymore since glibc-2.4. 
+ * Remove the statvfs-f_flag-constants, avoid-rpath, hurd-path_max, + kfreebsd-xattrs, freebsd-testsuite and ncurses-configure patches + applied upstream. + * Don't add runtime paths for libraries found in multiarch locations. + + -- Matthias Klose Thu, 17 Apr 2014 21:05:04 +0200 + +python3.4 (3.4.0-2) unstable; urgency=medium + + * Update to 20140407 from the 3.4 branch. + - Issue #21134: Fix segfault when str is called on an uninitialized + UnicodeEncodeError, UnicodeDecodeError, or UnicodeTranslateError object. + - Issue #19537: Fix PyUnicode_DATA() alignment under m68k. + - Issue #21155: asyncio.EventLoop.create_unix_server() now raises + a ValueError if path and sock are specified at the same time. + - Issue #21149: Improved thread-safety in logging cleanup during + interpreter shutdown. + - Issue #20145: `assertRaisesRegex` and `assertWarnsRegex` now raise a + TypeError if the second argument is not a string or compiled regex. + - Issue #21058: Fix a leak of file descriptor in + :func:`tempfile.NamedTemporaryFile`, close the file descriptor if + :func:`io.open` fails. + - Issue #21013: Enhance ssl.create_default_context() when used for server + side sockets to provide better security by default. + - Issue #20633: Replace relative import by absolute import. + - Issue #20980: Stop wrapping exception when using ThreadPool. + - Issue #21082: In os.makedirs, do not set the process-wide umask. + Note this changes behavior of makedirs when exist_ok=True. + - Issue #20990: Fix issues found by pyflakes for multiprocessing. + - Issue #21015: SSL contexts will now automatically select an elliptic + curve for ECDH key exchange on OpenSSL 1.0.2 and later, and otherwise + default to "prime256v1". + - Issue #20816: Fix inspect.getcallargs() to raise correct TypeError for + missing keyword-only arguments. + - Issue #20817: Fix inspect.getcallargs() to fail correctly if more + than 3 arguments are missing. + - Issue #6676: Ensure a meaningful exception is raised when attempting + to parse more than one XML document per pyexpat xmlparser instance. + - Issue #20942: PyImport_ImportFrozenModuleObject() no longer sets __file__ + to match what importlib does; this affects _frozen_importlib as well as + any module loaded using imp.init_frozen(). + - Documentation, tools, demo and test updates. + * Depend on the python3-tk packages in the autopkg tests. + * Fix LTO builds with GCC 4.9. + + -- Matthias Klose Mon, 07 Apr 2014 11:46:02 +0200 + +python3.4 (3.4.0-1) unstable; urgency=medium + + * Python 3.4.0 release. + * Update to 20140322 from the 3.4 branch. + + * Build without ffi on or1k. Addresses: #738519. + * Update autopkg tests (Martin Pitt): + - Don't fail if apport is not installed. + - Call su with explicit shell, as nobody has nologin as default shell now. + - Only use $SUDO_USER if that user actually exists in the testbed. + - Drop obsolete chowning of $TMPDIR and $ADTTMP; with current autopkgtest + $TMPDIR has appropriate permissions, and $ADTTMP is not being used. + + -- Matthias Klose Sat, 22 Mar 2014 13:39:34 +0100 + +python3.4 (3.4~rc2-1) unstable; urgency=medium + + * Python 3.4 release candidate 2. + + -- Matthias Klose Mon, 24 Feb 2014 10:40:55 +0100 + +python3.4 (3.4~rc1-1) unstable; urgency=medium + + * Python 3.4 release candidate 1. + + -- Matthias Klose Tue, 11 Feb 2014 13:38:50 +0100 + +python3.4 (3.4~b3-1) unstable; urgency=medium + + * Python 3.4 beta 3. + + -- Matthias Klose Mon, 27 Jan 2014 10:54:10 +0000 + +python3.4 (3.4~b2-1) unstable; urgency=medium + + * Python 3.4 beta 2. 
+ * Configure --with-system-libmpdec. + + -- Matthias Klose Sun, 05 Jan 2014 23:34:17 +0100 + +python3.4 (3.4~b1-5ubuntu2) trusty; urgency=medium + + * Disable the test_dbm autopkg test, failing from time to time ... + + -- Matthias Klose Fri, 03 Jan 2014 02:25:31 +0100 + +python3.4 (3.4~b1-5ubuntu1) trusty; urgency=medium + + * Build for Tcl/Tk 8.6. + + -- Matthias Klose Thu, 02 Jan 2014 18:05:27 +0100 + +python3.4 (3.4~b1-5) unstable; urgency=medium + + * Update to 20131231 from the trunk. + - Fix issue #20070, don't run test_urllib2net when network resources + are not enabled. + * Move the _sitebuiltins module into libpython3.4-minimal. + * distutils: On installation with --install-layout=deb, rename extensions + to include the multiarch tag. + Renaming of extensions for python3.4 is currently not done by dh-python. + See Debian #733128. + * autopkg tests: + - Update debian/tests/control to refer to python3.4. + - Generate locales for running the autopkg tests. + - Disable some currently failing autopkg tests. LP: #1264554. + - Disable test_compileall for the autopkg tests, fails only there. + * Don't run test_faulthandler on Aarch64, hangs on the buildds. + See LP: #1264354. + + -- Matthias Klose Tue, 31 Dec 2013 13:29:08 +0100 + +python3.4 (3.4~b1-4) unstable; urgency=medium + + * Update to 20131225 from the trunk. + * Fix python3.4--config --configdir. Closes: #733050. + + -- Matthias Klose Wed, 25 Dec 2013 21:56:04 +0100 + +python3.4 (3.4~b1-3) experimental; urgency=medium + + * Update to 20131221 from the trunk. + + -- Matthias Klose Sat, 21 Dec 2013 15:20:38 +0100 + +python3.4 (3.4~b1-2) experimental; urgency=low + + * Update to 20131202 from the trunk. + * Fix dbg symbols file for i386. + * Don't provide python3.4-dbm, available in a separate package. + * (Build-)depend on net-tools, test_uuid requires ifconfig. + * Fix distutils.sysconfig.get_makefile_filename(). + * Move operator module to the -minimal package. Closes: #731100. + + -- Matthias Klose Mon, 02 Dec 2013 14:36:44 +0100 + +python3.4 (3.4~b1-1) experimental; urgency=low + + * Python 3.4 beta 1. + + -- Matthias Klose Sun, 24 Nov 2013 23:21:49 +0100 + +python3.3 (3.3.3-2) unstable; urgency=low + + * Update to 20131123 from the 3.3 branch. + * Update hurd-path_max.diff. + + -- Matthias Klose Sat, 23 Nov 2013 08:57:21 +0100 + +python3.3 (3.3.3-1) unstable; urgency=low + + * Python 3.3.3 release. + * Update to 20131119 from the 3.3 branch. + * Regenerate the patches. + * Update the symbols files. + * Fix test support when the running kernel doesn't handle port reuse. + * libpython3.3-minimal replaces libpython3.3-stdlib (<< 3.2.3-7). + Closes: #725240. + + -- Matthias Klose Tue, 19 Nov 2013 08:46:55 +0100 + +python3.3 (3.3.2-7) unstable; urgency=low + + * Update to 20130918 from the 3.3 branch. + * Update symbols file. + * Fail the build if extensions for the minimal package are not in + the libpython-minimal package. Closes: #723624. + * Fix indentation in regenerated platform-lsbrelease.diff (Dmitry Shachnev). + LP: #1220508. + * Point to the python3-tk (instead of the python-tk) package when missing. + LP: #1184082. + + -- Matthias Klose Wed, 18 Sep 2013 12:19:47 +0200 + +python3.3 (3.3.2-6) unstable; urgency=medium + + * Update to 20130917 from the 3.3 branch. + - Fix SSL module to handle NULL bytes inside subjectAltNames general + names (CVE-2013-4238). Closes: #719567. + * Don't run the curses autopkg test. + * Set Multi-Arch attributes for binary packages. + * Fix multiarch include header for sparc64. 
Closes: #714802. + + -- Matthias Klose Tue, 17 Sep 2013 15:12:00 +0200 + +python3.3 (3.3.2-5) unstable; urgency=low + + * Update to 20130803 from the 3.3 branch. + - Fix fcntl test case on KFreeBSD (Petr Salinger). + * Disable some socket tests on KFreeBSD (Petr Salinger). + * Fix multiarch include header for sparc64. Closes: #714802. + * Update package descriptions (Filipus Klutiero). Closes: #715801. + + -- Matthias Klose Sun, 04 Aug 2013 17:38:35 +0200 + +python3.3 (3.3.2-4) unstable; urgency=low + + * Update to 20130612 from the 3.3 branch. + * Refresh patches. + * Don't run consistency check for cross builds. + * Really skip byte compile of non-existing sitecustomize.py. + * Fix the multiarch header file for mips64 (YunQiang Su). Closes: #710374. + + -- Matthias Klose Wed, 12 Jun 2013 22:55:02 +0200 + +python3.3 (3.3.2-3) unstable; urgency=low + + * Update to 20130527 from the 3.3 branch. + - Fix #17980, possible abuse of ssl.match_hostname() for denial of service + using certificates with many wildcards (CVE-2013-2099). Closes: #708530. + * Disable the test_io test on armel, armhf, mips, mipsel. Hangs the + buildds. + * Don't try to byte-compile sitecustomize.py if the target of the + symlink doesn't exist anymore. Addresses: #709157. + * Fix directory removal in maintainer scripts. Closes: #709963. + * Handle byte compilation in python3.3{-minimal,}, byte removal in + libpython3.3{-minimal,-stdlib}. + * Backport patch to fix issue #13146, possible race conditions when writing + .pyc/.pyo files in py_compile.py (Barry Warsaw). LP: #1058884. + * Mark all _Py_dg_* symbols as optional on m68k. Closes: #709888. + + -- Matthias Klose Mon, 27 May 2013 20:44:03 +0200 + +python3.3 (3.3.2-2) unstable; urgency=high + + * Fix the multiarch header file for ppc64. Closes: #708646. + * Disable running the tests on kfreebsd and the hurd. Please + follow-up in #708652 and #708653. + + -- Matthias Klose Fri, 17 May 2013 23:16:04 +0200 + +python3.3 (3.3.2-1) unstable; urgency=low + + * Python 3.3.2 release. + * Fix sysconfig.get_makefile_name() for the multiarch location. + * Set the platinclude dir back to the non-multiarch include path, + where the multiarch pyconfig.h compatibility header is found. + * Remove obsolete profile-doc patch. + * Run the pgo profile task in batches to avoid crashes during the + pgo profile run. + * Don't set yet any Multi-Arch: attributes in Debian. + * Build a libpython3.3-testsuite package. + * Add autopkg tests to run the installed testsuite in normal and debug + mode. + * Re-enable running the tests during the build. + * Add pyconfig.h compatibility headers. + + -- Matthias Klose Wed, 15 May 2013 19:41:15 +0200 + +python3.3 (3.3.1-1ubuntu5) raring; urgency=low + + * Remove obsolete profile-doc patch. + * Run the pgo profile task in batches to avoid crashes during the + pgo profile run. + * Disable the lto build on armhf for now. + * Final (?) set of autopkg test fixes. + * Issue #17012: shutil.which() no longer fallbacks to the PATH environment. + variable if empty path argument is specified. + * Issue #17782: Fix undefined behaviour on platforms where + ``struct timespec``'s "tv_nsec" member is not a C long. + + -- Matthias Klose Wed, 17 Apr 2013 23:35:49 +0200 + +python3.3 (3.3.1-1ubuntu4) raring; urgency=low + + * Don't run the test suite in random order. + * More autopkg test fixes. + + -- Matthias Klose Wed, 17 Apr 2013 13:33:00 +0200 + +python3.3 (3.3.1-1ubuntu3) raring; urgency=low + + * Fix sysconfig.get_makefile_name() for the multiarch location. 
+ * Set the platinclude dir back to the non-multiarch include path, + where the multiarch pyconfig.h compatibility header is found. + * Fix autopkg tests. + * More autopkgtest fixes (Jean-Baptiste Lallement): + - redirect stderr of command 'stop apport' to /dev/null. output to stderr + is an error for adt. + - script.py waits for child to exit and exit with child's return code. + - xpickle is not a valid value for option -u of regrtest.py. Removed it + LP: #1169150. + * Issue #17754, setting LANG and LC_ALL for the compiler call in ctypes/util. + * Issue #17761, platform._parse_release_file doesn't close the + /etc/lsb-release file, and doesn't know about 'Ubuntu'. + + -- Matthias Klose Tue, 16 Apr 2013 17:33:35 +0200 + +python3.3 (3.3.1-1ubuntu2) raring; urgency=low + + * Idle updates: + - Issue #17657: Show full Tk version in IDLE's about dialog. + - Issue #17613: Prevent traceback when removing syntax colorizer. + - Issue #1207589: Backwards-compatibility patch for right-click menu. + - Issue #16887: Now accepts Cancel in tabify/untabify dialog box. + - Issue #17625: Close the replace dialog after it is used. + - Issue #14254: Now handles readline correctly across shell restarts. + - Issue #17614: No longer raises exception when quickly closing a file. + - Issue #6698: Now opens just an editor window when configured to do so. + - Issue #8900: Using keyboard shortcuts in IDLE to open a file no longer + raises an exception. + - Issue #6649: Fixed missing exit status. + * Build a libpython3.3-testsuite package. LP: #301629. + * Add autopkg tests to run the installed testsuite in normal and debug + mode. + * Re-enable running the tests during the build. + * Add pyconfig.h compatibility headers. LP: #1094246. + + -- Matthias Klose Wed, 10 Apr 2013 23:05:23 +0200 + +python3.3 (3.3.1-1ubuntu1) raring; urgency=low + + * Merge with Debian; remaining changes: + - Build-depend on python3:any instead of python3. + + -- Matthias Klose Sat, 06 Apr 2013 16:21:34 +0200 + +python3.3 (3.3.1-1) unstable; urgency=low + + * Python 3.3.1 release. + * Call python with -E -S for the byte compilation. + + -- Matthias Klose Sat, 06 Apr 2013 15:12:07 +0200 + +python3.3 (3.3.1~rc1-2) experimental; urgency=low + + * Fix byte-compiliation/-removal for the split-out library packages. + LP: #1160944. + + -- Matthias Klose Sat, 30 Mar 2013 13:36:40 +0100 + +python3.3 (3.3.1~rc1-1) experimental; urgency=low + + * Python 3.3.1 release candidate 1. + + -- Matthias Klose Tue, 26 Mar 2013 10:45:37 +0100 + +python3.3 (3.3.0-12) experimental; urgency=low + + * Update to 20130306 from the 3.3 branch. + * Remove the HAVE_FSYNC configure workaround, not needed for 3.3. + * Remove the python3 manual symlink (now shipped upstream by default). + Closes: #701051. + + -- Matthias Klose Wed, 06 Mar 2013 16:38:41 +0800 + +python3.3 (3.3.0-11) experimental; urgency=low + + * Update to 20130220 from the 3.3 branch. + + -- Matthias Klose Wed, 20 Feb 2013 15:40:05 +0100 + +python3.3 (3.3.0-10) experimental; urgency=low + + * Update to 20130126 from the 3.3 branch. + * Update hurd patches. + * python3.3-dbg, libpython3.3-dbg: Drop dependency on python. + * python3.3-dbg: Make gdb (not gdb-minimal) a recommendation. + * Git rid of build-dependency on python. + * Add site-packages in virtual environments created by pyvenv. + Closes: #698777. + + -- Matthias Klose Sat, 26 Jan 2013 12:17:05 +0100 + +python3.3 (3.3.0-9) experimental; urgency=low + + * Update to 20130125 from the 3.3 branch. 
+ * Update cross build patches, and allow the package to cross build. + + -- Matthias Klose Fri, 25 Jan 2013 17:06:25 +0100 + +python3.3 (3.3.0-8) experimental; urgency=low + + * Update to 20130105 from the 3.3 branch. + * python-config --help returns with an exit value 0. LP: #1093860. + * Update package description for the -dbg packages. Closes: #696616. + + -- Matthias Klose Sat, 05 Jan 2013 18:39:32 +0100 + +python3.3 (3.3.0-7) experimental; urgency=low + + * Update to 20121220 from the 3.3 branch. + * debian/patches/sys-multiarch.diff: Expose multiarch triplet value + as sys.implementation._multiarch (Barry Warsaw). Closes: #695959. + Note: Usage of sysconfig.get_config_var('MULTIARCH') is preferred. + * Set the install schema to `unix_prefix', if a virtual environment + is detected (VIRTUAL_ENV env var present). Closes: #695758. + * python3.3-dev, libpython3.3-dev: Drop the dependency on libssl-dev. + + -- Matthias Klose Fri, 21 Dec 2012 07:24:41 +0100 + +python3.3 (3.3.0-6) experimental; urgency=low + + * Don't use xattrs on kfreebsd and the Hurd. + + -- Matthias Klose Tue, 04 Dec 2012 04:36:42 +0100 + +python3.3 (3.3.0-5) experimental; urgency=low + + * Update to 20121203 from the 3.3 branch. + * Make python3.3, python3.3-{minimal,dev,dbg} Multi-Arch: allowed. + * Use a shell implementation for the python-config script. + + -- Matthias Klose Mon, 03 Dec 2012 21:52:33 +0100 + +python3.3 (3.3.0-4) experimental; urgency=low + + * Update to 20121128 from the 3.3 branch. + * Don't link extensions with the shared libpython library. + * Override pointless lintian warning `hardening-no-fortify-functions' + for binaries built without optimization. + + -- Matthias Klose Wed, 28 Nov 2012 13:47:16 +0100 + +python3.3 (3.3.0-3) experimental; urgency=low + + * Update to 20121106 from the 3.3 branch. + * Filter-out cflags for profiled builds from _sysconfigdata. + * Fix multiarch plat-linux installation. LP: #1075891. + * Install _sysconfigdata.py from the shared builds. LP: #1075903. + + -- Matthias Klose Wed, 07 Nov 2012 14:31:02 +0100 + +python3.3 (3.3.0-2) experimental; urgency=low + + * Update to 20121021 from the 3.3 branch. + * Fix the interpreter name for the python3.3-dbg-config script. + + -- Matthias Klose Sun, 21 Oct 2012 09:51:05 +0200 + +python3.3 (3.3.0-1) experimental; urgency=low + + * Python 3.3.0 release. + + -- Matthias Klose Sat, 29 Sep 2012 12:59:24 +0200 + +python3.3 (3.3.0~rc3-1) experimental; urgency=low + + * Python 3.3.0 release candidate 3. + * Don't try to write lib2to3's pickled grammar files. Closes: #687200. + * Fix python-config manpage symlink. Closes: #687201. + + -- Matthias Klose Mon, 24 Sep 2012 16:22:17 +0200 + +python3.3 (3.3.0~rc2-2ubuntu1) quantal; urgency=low + + * Encode the version in the devhelp documentation name. LP: #787039. + + -- Matthias Klose Mon, 10 Sep 2012 12:56:13 +0200 + +python3.3 (3.3.0~rc2-2) experimental; urgency=low + + * Fix typo fixing the pkgconfig file. + + -- Matthias Klose Mon, 10 Sep 2012 11:13:51 +0200 + +python3.3 (3.3.0~rc2-1) experimental; urgency=low + + * Python 3.3.0 release candidate 2. + * Add the platform include dir to pkgconfig's CFlags. + * Hint on installing the python-gdbm package on failing _gdbm import. + LP: #995616. + * libpython3.3: Fix libpython3.3.so symlink. Closes: #686377. + * Don't use `-n' anymore to start idle in the desktop/menu files. 
+ + -- Matthias Klose Sun, 09 Sep 2012 13:38:55 +0200 + +python3.3 (3.3.0~rc1-2) experimental; urgency=low + + * distutils: Add the multiarch python path to the include directories. + Closes: #685041. + * Remove /etc/python3.3 in libpython3.3-minimal instead of python3.3-minimal. + Closes: #681979. + * Remove /etc/python/sysconfig.cfg, not available anymore in python3.3. + Closes: #685016. + * Don't ship the _gdbm and _tkinter extensions in the -dbg package. + Closes: #685261. + * Fix verbose parallel builds for the sharedmods target. + * Don't install the pickled lib2to3 grammar files. Closes: #685214. + * Build extensions with fortify flags. + * Overwrite arch-dependent-file-not-in-arch-specific-directory warnings. + + -- Matthias Klose Tue, 28 Aug 2012 19:47:58 +0200 + +python3.3 (3.3.0~rc1-1) experimental; urgency=low + + * Python 3.3.0 release candidate 1. + + -- Matthias Klose Sun, 26 Aug 2012 23:15:00 +0200 + +python3.3 (3.3.0~b2-1) experimental; urgency=low + + * Python 3.3.0 beta2 release. + * Fix removal of the _tkinter and dbm extensions for multiarch builds. + Closes: #684461. + * Use _sysconfigdata.py in distutils to initialize distutils. + Closes: #682475. + * Fix symlink for static libpython. Closes: #684608. + + -- Matthias Klose Mon, 13 Aug 2012 11:05:00 +0200 + +python3.3 (3.3.0~b1-3) experimental; urgency=low + + * Update to 20120712 from the trunk. + * Install separate _sysconfigdata.py for normal and debug builds. + * Install into multiarch locations. + * Split out multiarch packages libpython3.3-{minimal,stdlib,dev,dbg}. + + -- Matthias Klose Fri, 13 Jul 2012 00:43:42 +0200 + +python3.3 (3.3.0~b1-2) experimental; urgency=low + + * Update to 20120701 from the trunk. + + -- Matthias Klose Sun, 01 Jul 2012 11:45:12 +0200 + +python3.3 (3.3.0~b1-1) experimental; urgency=low + + * Python 3.3.0 beta1 release. + * Fix symlink for the -gdb.py file. + * debian/copyright: Add libmpdec license. + * Enable fortified build. + + -- Matthias Klose Wed, 27 Jun 2012 08:44:56 +0200 + +python3.3 (3.3.0~a4-1) experimental; urgency=low + + * Python 3.3.0 alpha4 release. + * Update to 20120620 from the trunk. + * Build _ctypes as an extension, not a builtin. + * Mark symbols defined in the _ctypes extension as optional. + * Remove references to the removed pyton3.3-documenting file. + * The wininst-* files cannot be built within Debian from the included + sources, needing a zlib mingw build, which the zlib maintainer isn't + going to provide. + * Use the underscore.js file provided by the libjs-underscore package. + * Let pydoc handle dist-packages the same as site-packages. + * Avoid runtime path for the sqlite extension. + + -- Matthias Klose Wed, 20 Jun 2012 13:09:19 +0200 + +python3.3 (3.3.0~a3-1) experimental; urgency=low + + * Python 3.3.0 alpha3 release. + * Build the dbm extension using db5.3. + * Update symbols file for a3. + + -- Matthias Klose Wed, 02 May 2012 23:28:46 +0200 + +python3.3 (3.3.0~a2-1) experimental; urgency=low + + * Python 3.3.0 alpha2 release. + * Update to 20120404 from the trunk. + * Build-depend on expat (>= 2.1). + + -- Matthias Klose Wed, 04 Apr 2012 16:31:34 +0200 + +python3.3 (3.3.0~a1-1) experimental; urgency=low + + * Python 3.3.0 alpha1 release. + * Update to 20120321 from the trunk. + * Update debian/copyright. + * Build-depend on expat (>= 2.1~). + + -- Matthias Klose Thu, 22 Mar 2012 06:14:01 +0100 + +python3.3 (3.3~20120109-1) experimental; urgency=low + + * 3.3 20120109 snapshot from the trunk. + * Merge packaging from python3.2 3.2.2-4. 
+ + -- Matthias Klose Sun, 08 Jan 2012 09:44:33 +0100 + +python3.3 (3.3~20110523-1) experimental; urgency=low + + * Initial Python 3.3 packaging. + + -- Matthias Klose Mon, 23 May 2011 09:20:52 +0200 + +python3.2 (3.2.2-4) unstable; urgency=low + + * The static library belongs into the -dev package. + * Remove obsolete attributes in the control file. + + -- Matthias Klose Sat, 07 Jan 2012 20:46:39 +0100 + +python3.2 (3.2.2-3) unstable; urgency=low + + * Update to 20120106 from the 3.2 branch. + * Install manual pages for 2to3 and python-config. + * Fix file permission of token.py module. + * Add the ability to build an python3.x udeb, as copy of the + python3.x-minimal package (Colin Watson). + * Overwrite some lintian warnings: + - The -dbg interpreters are not unusual. + - The -gdb.py files don't need a python dependency. + - lintian can't handle a whatis entry starting with one word on the line. + * Fix test failures related to distutils debian installation layout. + * Update symbols files. + * Add build-arch/build-indep targets. + * Regenerate Setup and Makefiles after correcting Setup.local. + * profiled-build.diff: Pass PY_CFLAGS instead of CFLAGS for the profiled + build. + * Pass dpkg-buildflags to the build process, and build third party + extensions with these flags. + * Add support to build using -flto (and -g1) on some architectures. + * Disable pgo builds for some architectures (for now, keep just + amd64 armel armhf i386 powerpc ppc64). + * Build-depend on libgdbm-dev to build and run the gdbm tests. + * Build-depend on xvfb to run the tkinter tests. + + -- Matthias Klose Fri, 06 Jan 2012 20:10:13 +0100 + +python3.2 (3.2.2-2) unstable; urgency=low + + * Update platform patches (alpha, hppa, mips, sparc). + + -- Matthias Klose Fri, 02 Dec 2011 10:24:05 +0100 + +python3.2 (3.2.2-1) unstable; urgency=low + + * Python 3.2.2 release. + * Update to 20111201 from the 3.2 branch. + * Search headers in /usr/include/ncursesw for the curses/panel extensions. + * New patch, ctypes-arm, allow for ",hard-float" after libc6 in ldconfig -p + output (Loic Minier). LP: #898172. + + -- Matthias Klose Thu, 01 Dec 2011 13:19:16 +0100 + +python3.2 (3.2.2~rc1-1) unstable; urgency=low + + * Python 3.2.2 release candidate 1. + + -- Matthias Klose Sun, 14 Aug 2011 20:25:35 +0200 + +python3.2 (3.2.1-2) unstable; urgency=low + + * Update to 20110803 from the 3.2 branch. + * Revert previous change to treat Linux 3.x as Linux 2. Use the + plat-linux3 directory instead. + * Use linux-any for some build dependencies. Closes: #634310. + + -- Matthias Klose Wed, 03 Aug 2011 15:16:05 +0200 + +python3.2 (3.2.1-1) unstable; urgency=medium + + * Python 3.2.1 release. + * Update lib-argparse patch (Pino Toscano). Closes: #631635. + * Treat Linux 3.x as Linux 2. Closes: #633015. + + -- Matthias Klose Sun, 10 Jul 2011 21:46:36 +0200 + +python3.2 (3.2.1~rc2-1) unstable; urgency=low + + * Python 3.2.1 release candidate 2. + * Add profile/pstats to the python3.2 package, update debian copyright. + * Don't run the benchmark on hurd-i386. + * Disable threading tests on hurd-i386. Closes: #631634. + * Don't add the bsddb multilib path, if already in the standard lib path. + + -- Matthias Klose Mon, 04 Jul 2011 20:27:52 +0200 + +python3.2 (3.2.1~rc1-1) unstable; urgency=low + + * Python 3.2.1 release candidate 1. + * Only enable sphinx-0.x patches when building with sphinx-0.x. + + -- Matthias Klose Wed, 18 May 2011 12:15:47 +0200 + +python3.2 (3.2-4) unstable; urgency=low + + * Update to 20110504 from the 3.2 branch. 
+ * Disable the profiled build on ia64 and m68k. + * Update symbols file for m68k (Thorsten Glaser). + + -- Matthias Klose Wed, 04 May 2011 21:32:08 +0200 + +python3.2 (3.2-3) unstable; urgency=low + + * Update to 20110427 from the 3.2 branch. + - Fix argparse import. Closes: #624277. + * Keep the ssl.PROTOCOL_SSLv2 module constant , just raise an exception + when trying to create a PySSL object. #624127. + * Don't depend on the locale and specific awk implementations in prerm. + Closes: #623466, #620836. + * Remove the old local site directory. Closes: #623057. + + -- Matthias Klose Wed, 27 Apr 2011 20:40:29 +0200 + +python3.2 (3.2-2) unstable; urgency=low + + * Update to 20110419 from the 3.2 branch. + * Re-enable profile-guided builds. + * Build without OpenSSL v2 support. Closes: #622004. + * Force linking the curses module against libncursesw. Closes: #622064. + * Re-enable running the testsuite during the build. + + -- Matthias Klose Tue, 19 Apr 2011 17:54:36 +0200 + +python3.2 (3.2-1) unstable; urgency=low + + * Python 3.2 final release. + + -- Matthias Klose Sun, 20 Feb 2011 19:22:24 +0100 + +python3.2 (3.2~rc3-1) experimental; urgency=low + + * Python 3.2 release candidate 3. + + -- Matthias Klose Mon, 14 Feb 2011 16:12:14 +0100 + +python3.2 (3.2~rc1-2) experimental; urgency=low + + * Fix upgrade of the python3.2-dev package. Closes: #610370. + + -- Matthias Klose Wed, 19 Jan 2011 02:21:19 +0100 + +python3.2 (3.2~rc1-1) experimental; urgency=low + + * Python 3.2 release candidate 1. + + -- Matthias Klose Sun, 16 Jan 2011 22:17:09 +0100 + +python3.2 (3.2~b2-1) experimental; urgency=low + + * Python 3.2 beta2 release. + * Fix FTBFS on hurd-i386 (Pino Toscano). Closes: #606152). + + -- Matthias Klose Tue, 21 Dec 2010 21:23:21 +0100 + +python3.2 (3.2~b1-1) experimental; urgency=low + + * Python 3.2 beta1 release. + * Configure with --enable-loadable-sqlite-extensions. + + -- Matthias Klose Mon, 06 Dec 2010 12:19:09 +0100 + +python3.2 (3.2~a4-2) experimental; urgency=low + + * Fix build failure on the hurd. + + -- Matthias Klose Fri, 26 Nov 2010 06:38:41 +0100 + +python3.2 (3.2~a4-1) experimental; urgency=low + + * Python 3.2 alpha4 release. + * Update to the py3k branch (20101124). + * Move the Makefile into the -min package, required by sysconfig. + Addresses: #603237. + + -- Matthias Klose Wed, 24 Nov 2010 22:20:32 +0100 + +python3.2 (3.2~a3-2) experimental; urgency=low + + * Update to the py3k branch (20101018). + - Issue #10094: Use versioned .so files on GNU/kfreeBSD and the GNU Hurd. + Closes: #600183. + + -- Matthias Klose Mon, 18 Oct 2010 19:34:39 +0200 + +python3.2 (3.2~a3-1) experimental; urgency=low + + * Python 3.2 alpha3 release. + * Make Lib/plat-gnukfreebsd[78] ready for python3. Closes: #597874. + + -- Matthias Klose Tue, 12 Oct 2010 16:13:15 +0200 + +python3.2 (3.2~a2-7) experimental; urgency=low + + * Update to the py3k branch (20100926). + + -- Matthias Klose Sun, 26 Sep 2010 14:41:18 +0200 + +python3.2 (3.2~a2-6) experimental; urgency=low + + * Update to the py3k branch (20100919). + * Update GNU/Hurd patches (Pino Toscano). Closes: #597320. + + -- Matthias Klose Sun, 19 Sep 2010 12:45:14 +0200 + +python3.2 (3.2~a2-5) experimental; urgency=low + + * Update to the py3k branch (20100916). + * Provide Lib/plat-gnukfreebsd[78] (Jakub Wilk). Addresses: #593818. + * Assume working semaphores, don't rely on running kernel for the check. + LP: #630511. 
+ + -- Matthias Klose Thu, 16 Sep 2010 14:41:58 +0200 + +python3.2 (3.2~a2-4) experimental; urgency=low + + * Update to the py3k branch (20100911). + * Add the sysconfig module to python3.2-minimal. + * Remove dist-packages/README. + * Make xargs --show-limits in the maintainer scripts independent from + the locale. + + -- Matthias Klose Sat, 11 Sep 2010 20:59:47 +0200 + +python3.2 (3.2~a2-3) experimental; urgency=low + + * Update to the py3k branch (20100910). + * Disable profile feedback based optimization on armel. + * Add copyright information for expat, libffi and zlib. Sources + for the wininst-* files are in PC/bdist_wininst. Closes: #596276. + * Run the testsuite in parallel, when parallel= is set in DEB_BUILD_OPTIONS. + + -- Matthias Klose Fri, 10 Sep 2010 20:28:16 +0200 + +python3.2 (3.2~a2-2) experimental; urgency=low + + * Fix distutils.sysconfig.get_makefile_name for debug builds. + + -- Matthias Klose Thu, 09 Sep 2010 02:40:11 +0200 + +python3.2 (3.2~a2-1) experimental; urgency=low + + * Python 3.2 alpha2 release. + * Update to the py3k branch (20100908). + * Provide /usr/lib/python3/dist-packages as location for public python + packages. + + -- Matthias Klose Wed, 08 Sep 2010 17:36:06 +0200 + +python3.2 (3.2~a1-1) experimental; urgency=low + + * Python 3.2 alpha1 release. + - Files removed: Lib/profile.py, Lib/pstats.py, PC/icons/source.xar. + * Update to the py3k branch (20100827). + * Fix detection of ffi.h header file. Closes: #591408. + * python3.1-dev: Depend on libssl-dev. LP: #611845. + + -- Matthias Klose Fri, 27 Aug 2010 21:40:31 +0200 + +python3.2 (3.2~~20100707-0ubuntu1) maverick; urgency=low + + * Move the pkgconfig file into the -dev package. + * Update preremoval scripts for __pycache__ layout. + * Run hooks from /usr/share/python3/runtime.d/ + * Update distutils-install-layout and debug-build patches. + + -- Matthias Klose Wed, 07 Jul 2010 12:38:52 +0200 + +python3.2 (3.2~~20100706-0ubuntu1) maverick; urgency=low + + * Test build, taken from the py3k branch (20100706). + * Merge with the python3.1 packaging. + + -- Matthias Klose Tue, 06 Jul 2010 17:10:51 +0200 + +python3.2 (3.2~~20100704-0ubuntu1) maverick; urgency=low + + * Test build, taken from the py3k branch (20100704). + + -- Matthias Klose Sun, 04 Jul 2010 16:04:45 +0200 + +python3.2 (3.2~~20100421-0ubuntu1) lucid; urgency=low + + * Test build, taken from the py3k branch (20100421). + + -- Matthias Klose Wed, 21 Apr 2010 22:04:14 +0200 + +python3.1 (3.1.2+20100703-1) unstable; urgency=low + + * Update to the 3.1 release branch, 20100703. + * Convert internal dpatch system to quilt. + * Update module list for python3-minimal. + + -- Matthias Klose Sat, 03 Jul 2010 14:18:18 +0200 + +python3.1 (3.1.2-3) unstable; urgency=low + + * Update to the 3.1 release branch, 20100508. + * Fix backport of issue #8140. Closes: #578896. + + -- Matthias Klose Sat, 08 May 2010 15:37:35 +0200 + +python3.1 (3.1.2-2) unstable; urgency=low + + * Update to the 3.1 release branch, 20100421. + * Update patch for issue #8032, gdb7 hooks for debugging. + * Fix issue #8233: When run as a script, py_compile.py optionally + takes a single argument `-`. + * Don't build-depend on locales on avr32. + + -- Matthias Klose Wed, 21 Apr 2010 21:12:37 +0200 + +python3.1 (3.1.2-1) unstable; urgency=low + + * Python 3.1.2 release. + * Fix issue #4961: Inconsistent/wrong result of askyesno function in + tkMessageBox with Tcl8.5. LP: #462950. + * Don't complain when /usr/local is not writable on installation. 
+ * Apply proposed patch for issue #8032, gdb7 hooks for debugging. + + -- Matthias Klose Sun, 21 Mar 2010 17:59:49 +0100 + +python3.1 (3.1.2~rc1-2) unstable; urgency=low + + * Update to the 3.1 release branch, 20100316. + * Backport issue #8140: Extend compileall to compile single files. + Add -i option. + + -- Matthias Klose Tue, 16 Mar 2010 02:38:45 +0100 + +python3.1 (3.1.2~rc1-1) unstable; urgency=low + + * Python 3.1.2 release candidate 1. + - Replace the Monty Python audio test file. Closes: #568676. + * Build using libdb4.8-dev. Only used for the dbm extension; the bsddb3 + extension isn't built from the core packages anymore. + + -- Matthias Klose Thu, 11 Mar 2010 17:26:17 +0100 + +python3.1 (3.1.1-3) unstable; urgency=low + + * Update to the 3.1 release branch, 20100119. + * Hurd fixes (Pino Toscano): + - hurd-broken-poll.dpatch: ported from 2.5. + - hurd-disable-nonworking-constants.dpatch: disable a few constants from + the public API whose C counterparts are not implemented, so using them + either always blocks or always fails (caused issues in the test suite). + - hurd-path_max.dpatch (hurd only): change few PATH_MAX occurrences to + MAXPATHLEN (which is defined by the python lib if not defined by the OS). + - cthreads.dpatch: Refresh. + - Exclude the profiled build for hurd. + - Disable six blocking tests from the test suite. + * Don't run the testsuite on armel and hppa until someone figures out + the blocking tests. + + -- Matthias Klose Tue, 19 Jan 2010 22:02:14 +0100 + +python3.1 (3.1.1-2) unstable; urgency=low + + * Update to the 3.1 release branch, 20100116. + * Fix bashism in makesetup shell script. Closes: #530170, #530171. + * Fix build issues on avr (Bradley Smith). Closes: #528439. + - Configure --without-ffi. + - Don't run lengthly tests. + + -- Matthias Klose Sat, 16 Jan 2010 23:28:05 +0100 + +python3.1 (3.1.1-1) experimental; urgency=low + + * Python 3.1.1 final release. + * Update to the 3.1 release branch, 20091011. + * Remove /usr/local/lib/python3.1 on package removal, if empty. + * Build _hashlib as a builtin. LP: #445530. + * python3.1-doc: Don't compress the sphinx inventory. + * python3.1-doc: Fix jquery.js symlink. LP: #447370. + * Run the benchmark with -C 2 -n 5 -w 4 on all architectures. + * python3.1-dbg: Don't create debug subdirectory in /usr/local. No + separate debug directory needed anymore. + * Fix title of devhelp document. LP: #423551. + + -- Matthias Klose Sun, 11 Oct 2009 22:01:57 +0200 + +python3.1 (3.1-1) experimental; urgency=low + + * Python 3.1 final release. + * Update to the 3.1 release branch, 20090723. + * Add explicit build dependency on tk8.5-dev. + + -- Matthias Klose Thu, 23 Jul 2009 15:20:35 +0200 + +python3.1 (3.1-0ubuntu2) karmic; urgency=low + + * Disable profile feedback based optimization on amd64 (GCC + PR gcov-profile/38292). + + -- Matthias Klose Fri, 24 Jul 2009 16:27:22 +0200 + +python3.1 (3.1-0ubuntu1) karmic; urgency=low + + * Python 3.1 final release. + * Update to the 3.1 release branch, 20090723. + * Add explicit build dependency on tk8.5-dev. + + -- Matthias Klose Thu, 23 Jul 2009 18:52:17 +0200 + +python3.1 (3.1~rc2+20090622-1) experimental; urgency=low + + [Matthias Klose] + * Python 3.1 rc2 release. Closes: #529320. + * Update to the trunk, 20090622, remove patches integrated upstream. + * Configure with --with-fpectl --with-dbmliborder=bdb --with-wide-unicode. + NOTE: The --with-wide-unicode configuration will break most extensions + built with 3.1~a1, but is consistent with python2.x configurations. 
+ * Add symbols files for libpython3.1 and python3.1-dbg, don't include symbols + from builtins, which can either be built as builtins or extensions. + * Keep an empty lib-dynload in python3.1-minimal to avoid a warning on + startup. + * python3.1-doc: Depend on libjs-jquery, use jquery.js from this package. + Closes: #523485. + * Do not add /usr/lib/pythonXY.zip on sys.path. + * Add symbols files for libpython3.1 and python3.1-dbg, don't include symbols + from builtins, which can either be built as builtins or extensions. + * Keep an empty lib-dynload in python3.1-minimal to avoid a warning on + startup. + * Fix some lintian warnings. + * Use the information in /etc/lsb-release for platform.dist(). LP: #196526. + * Move the bdist_wininst files into the -dev package (only needed to build + windows installers). + * Document changes to the site directory name in the installation manual. + * Don't build a profiled binary. Closes: #521811. + + * Address issues when working with PYTHONUSERBASE and non standard prefix + (pointed out by Larry Hastings): + - distutils.sysconfig.get_python_lib(): Only return ".../dist-packages" if + prefix is the default prefix and if PYTHONUSERBASE is not set in the + environment. + - site.addusersitepackages(): Add USER_BASE/.../dist-packages to sys.path. + * Always use the `unix_prefix' scheme for setup.py install in a virtualenv + setup. LP: #339904. + * Don't make the setup.py install options --install-layout=deb and --prefix + conflict with each other. + * distutils: Always install into `/usr/local/lib/python3.1/dist-packages' + if an option `--prefix=/usr/local' is present (except for virtualenv + and PYTHONUSERBASE installations). LP: #362570. + * Always use `site-packages' as site directory name in virtualenv. + + [Marc Deslauriers] + * debian/pyhtml2devhelp.py: update for sphinx generated documentation. + * debian/rules: re-enable documentation files for devhelp. + + -- Matthias Klose Mon, 22 Jun 2009 16:18:39 +0200 + +python3.1 (3.1~a1+20090322-1) experimental; urgency=low + + * Python 3.1 alpha1 release. + * Update to the trunk, 20090322. + * Update installation schemes: LP: #338395. + - When the --prefix option is used for setup.py install, Use the + `unix_prefix' scheme. + - Use the `deb_system' scheme if --install-layout=deb is specified. + - Use the the `unix_local' scheme if neither --install-layout=deb + nor --prefix is specified. + * Use the information in /etc/lsb-release for platform.dist(). LP: #196526. + * pydoc: Fix detection of local documentation files. + * Build a shared library configured --with-pydebug. LP: #322580. + * Fix some lintian warnings. + + -- Matthias Klose Mon, 23 Mar 2009 00:01:27 +0100 + +python3.1 (3.1~~20090226-1) experimental; urgency=low + + * Python-3.1 snapshot (20090226), upload to experimental. + + -- Matthias Klose Thu, 26 Feb 2009 16:18:41 +0100 + +python3.1 (3.1~~20090222-0ubuntu1) jaunty; urgency=low + + * Python-3.1 snapshot (20090222). + * Build the _dbm extension using the Berkeley DB backend. + + -- Matthias Klose Sun, 22 Feb 2009 12:58:58 +0100 + +python3.0 (3.0.1-0ubuntu4) jaunty; urgency=low + + * Don't build-depend on locales on sparc. Currently not installable. + + -- Matthias Klose Sun, 22 Feb 2009 12:48:38 +0100 + +python3.0 (3.0.1-0ubuntu3) jaunty; urgency=low + + * Update to 20090222 from the release30-maint branch. + + -- Matthias Klose Sun, 22 Feb 2009 11:09:58 +0100 + +python3.0 (3.0.1-0ubuntu2) jaunty; urgency=low + + * Allow docs to be built with Sphinx 0.5.x. 
+ + -- Matthias Klose Tue, 17 Feb 2009 12:58:02 +0100 + +python3.0 (3.0.1-0ubuntu1) jaunty; urgency=low + + * New upstream version. + + -- Matthias Klose Mon, 16 Feb 2009 17:18:23 +0100 + +python3.0 (3.0-0ubuntu2) jaunty; urgency=low + + * Update to 20090213 from the release30-maint branch. + + -- Matthias Klose Fri, 13 Feb 2009 15:49:12 +0100 + +python3.0 (3.0-0ubuntu1) jaunty; urgency=low + + * Final Python-3.0 release. + + -- Matthias Klose Thu, 04 Dec 2008 09:00:09 +0100 + +python3.0 (3.0~rc3-0ubuntu4) jaunty; urgency=low + + * Update to 20081127 from the py3k branch. + * Ensure that all extensions from the -minimal package are statically + linked into the interpreter. LP: #301597. + * Include expat, _elementtree, datetime in -minimal to link + these extensions statically. + + -- Matthias Klose Thu, 27 Nov 2008 08:49:02 +0100 + +python3.0 (3.0~rc3-0ubuntu3) jaunty; urgency=low + + * Ignore errors when running the profile task. + + -- Matthias Klose Sun, 23 Nov 2008 15:50:17 +0100 + +python3.0 (3.0~rc3-0ubuntu2) jaunty; urgency=low + + * Don't run test_ioctl on the buildd, before the buildd chroot is fixed: + Unable to open /dev/tty. + + -- Matthias Klose Sun, 23 Nov 2008 15:28:02 +0100 + +python3.0 (3.0~rc3-0ubuntu1) jaunty; urgency=low + + * Update to the python-3.0 release candidate 3. + + -- Matthias Klose Sun, 23 Nov 2008 13:14:20 +0100 + +python3.0 (3.0~rc1+20081027-0ubuntu1) intrepid; urgency=low + + * Update to 20081027 from the py3k branch. LP: #279227. + * Fix typos and section names in doc-base files. LP: #273344. + * Build a new package libpython3.0. + * For locally installed packages, create a directory + /usr/local/lib/python3.0/dist-packages. This is the default for + installations done with distutils and setuptools. Third party stuff + packaged within the distribution goes to /usr/lib/python3.0/dist-packages. + There is no /usr/lib/python3.0/site-packages in the file system and + on sys.path. No package within the distribution must not install + anything in this location. + * distutils: Add an option --install-layout=deb, which + - installs into $prefix/dist-packages instead of $prefix/site-packages. + - doesn't encode the python version into the egg name. + + -- Matthias Klose Mon, 27 Oct 2008 23:38:42 +0100 + +python3.0 (3.0~b3+20080915-0ubuntu1) intrepid; urgency=low + + * Update to 20080915 from the py3k branch. + * Build gdbm + + -- Matthias Klose Mon, 15 Sep 2008 23:56:44 +0200 + +python3.0 (3.0~b3-0ubuntu1~ppa1) intrepid; urgency=low + + * Python 3.0 beta3 release. + + -- Matthias Klose Sun, 24 Aug 2008 03:49:26 +0200 + +python3.0 (3.0~b2-0ubuntu1~ppa1) intrepid; urgency=low + + * Python 3.0 beta2 release. + + -- Matthias Klose Thu, 07 Aug 2008 14:57:02 +0000 + +python3.0 (3.0~b1-0ubuntu1~ppa1) intrepid; urgency=low + + * Python 3.0 beta1 release. + + -- Matthias Klose Tue, 15 Jul 2008 16:10:52 +0200 + +python3.0 (3.0~a5+0530-0ubuntu1) intrepid; urgency=low + + * Update to snapshot taken from the py3k branch. + + -- Matthias Klose Thu, 29 May 2008 15:50:55 +0200 + +python3.0 (3.0~a1-0ubuntu2) gutsy; urgency=low + + * Disable running the benchmark. + + -- Matthias Klose Fri, 31 Aug 2007 23:22:34 +0000 + +python3.0 (3.0~a1-0ubuntu1) gutsy; urgency=low + + * First Python-3.0 alpha release. + + -- Matthias Klose Fri, 31 Aug 2007 21:26:21 +0200 + +python2.6 (2.6~alpha~pre1-~0ubuntu1~ppa1) gutsy; urgency=low + + * Snapshot build, an "how to use tilde in version numbers" upload. + * SVN 20070831. 
+ + -- Matthias Klose Fri, 31 Aug 2007 15:56:09 +0200 + +python2.5 (2.5.2-4) unstable; urgency=low + + * Update to 20080427, taken from the 2.5 release branch. + - Fix issues #2670, #2682. + * Disable running pybench on the hppa buildd (ftbfs). + * Allow setting BASECFLAGS, OPT and EXTRA_LDFLAGS (like, CC, CXX, CPP, + CFLAGS, CPPFLAGS, CCSHARED, LDSHARED) from the environment. + * Support parallel= in DEB_BUILD_OPTIONS (see #209008). + + -- Matthias Klose Sun, 27 Apr 2008 10:40:51 +0200 + +python2.5 (2.5.2-3) unstable; urgency=medium + + * Update to 20080416, taken from the 2.5 release branch. + - Fix CVE-2008-1721, integer signedness error in the zlib extension module. + - Fix urllib2 file descriptor happens byte-at-a-time, reverting + a fix for excessively large memory allocations when calling .read() + on a socket object wrapped with makefile(). + * Disable some regression tests on some architectures: + - arm: test_compiler, test_ctypes. + - armel: test_compiler. + - hppa: test_fork1, test_wait3. + - m68k: test_bsddb3, test_compiler. + * Build-depend on libffi-dev instead of libffi4-dev. + * Fix CVE-2008-1679, integer overflows in the imageop module. + + -- Matthias Klose Wed, 16 Apr 2008 23:37:46 +0200 + +python2.5 (2.5.2-2) unstable; urgency=low + + * Use site.addsitedir() to add directories in /usr/local to sys.path. + Addresses: #469157, #469818. + + -- Matthias Klose Sat, 08 Mar 2008 16:11:23 +0100 + +python2.5 (2.5.2-1) unstable; urgency=low + + * Python 2.5.2 release. + * Merge from Ubuntu: + - Move site customization into sitecustomize.py, don't make site.py + a config file. Addresses: #309719, #413172, #457361. + - Move site.py to python2.4-minimal, remove `addbuilddir' from site.py, + which is unnecessary for installed builds. + - python2.5-dev: Recommend libc-dev instead of suggesting it. LP: #164909. + - Fix issue 961805, Tk Text.edit_modified() fails. LP: #84720. + + -- Matthias Klose Thu, 28 Feb 2008 23:18:52 +0100 + +python2.5 (2.5.1-7) unstable; urgency=low + + * Update to 20080209, taken from the 2.5 release branch. + * Build the _bsddb extension with db-4.5 again; 4.6 is seriously + broken when used with the _bsddb extension. + * Do not run pybench on arm and armel. + * python2.5: Provide python2.5-wsgiref. + * Fix a pseudo RC report with duplicated attributes in the control + file. Closes: #464307. + + -- Matthias Klose Sun, 10 Feb 2008 00:22:57 +0100 + +python2.5 (2.5.1-6) unstable; urgency=low + + * Update to 20080102, taken from the 2.5 release branch. + - Only define _BSD_SOURCE on OpenBSD systems. Closes: #455400. + * Fix handling of packages in linecache.py (Kevin Goodsell). LP: #70902. + * Bump debhelper to v5. + * Register binfmt for .py[co] files. + * Use absolute paths when byte-compiling files. Addresses: #453346. + Closes: #413566, LP: #177722. + * CVE-2007-4965, http://bugs.python.org/issue1179: + Multiple integer overflows in the imageop module in Python 2.5.1 and + earlier allow context-dependent attackers to cause a denial of service + (application crash) and possibly obtain sensitive information (memory + contents) via crafted arguments to (1) the tovideo method, and unspecified + other vectors related to (2) imageop.c, (3) rbgimgmodule.c, and other + files, which trigger heap-based buffer overflows. + Patch prepared by Stephan Herrmann. Closes: #443333, LP: #163845. + * Register info docs when doing source only uploads. LP: #174786. + * Remove deprecated value from categories in desktop file. LP: #172874. 
+ * python2.5-dbg: Don't include the gdbm and _tkinter extensions, now provided + in separate packages. + * Provide a symlink changelog -> NEWS. Closes: #439271. + * Fix build failure on hurd, working around poll() on systems on which it + returns an error on invalid FDs. Closes: #438914. + * Configure --with-system-ffi on all architectures. Closes: #448520. + * Fix version numbers in copyright and README files (Dan O'Huiginn). + Closes: #446682. + * Move some documents from python2.5 to python2.5-dev. + + -- Matthias Klose Wed, 02 Jan 2008 22:22:19 +0100 + +python2.5 (2.5.1-5) unstable; urgency=low + + * Build the _bsddb extension with db-4.6. + + -- Matthias Klose Fri, 17 Aug 2007 00:39:35 +0200 + +python2.5 (2.5.1-4) unstable; urgency=low + + * Update to 20070813, taken from the 2.5 release branch. + * Include plat-mac/plistlib.py (plat-mac is not in sys.path by default. + Closes: #435826. + * Use emacs22 to build the documentation in info format. Closes: #434969. + * Build-depend on db-dev (>= 4.6). Closes: #434965. + + -- Matthias Klose Mon, 13 Aug 2007 22:22:44 +0200 + +python2.5 (2.5.1-3) unstable; urgency=high + + * Support mixed-endian IEEE floating point, as found in the ARM old-ABI + (Aurelien Jarno). Closes: #434905. + + -- Matthias Klose Fri, 27 Jul 2007 20:01:35 +0200 + +python2.5 (2.5.1-2) unstable; urgency=low + + * Update to 20070717, taken from the 2.5 release branch. + * Fix reference count for sys.pydebug variable. Addresses: #431393. + * Build depend on libbluetooth-dev instead of libbluetooth2-dev. + + -- Matthias Klose Tue, 17 Jul 2007 14:09:47 +0200 + +python2.5 (2.5.1-1) unstable; urgency=low + + * Python-2.5.1 release. + * Build-depend on gcc-4.1 (>= 4.1.2-4) on alpha, powerpc, s390, sparc. + * Merge from Ubuntu: + - Add debian/patches/subprocess-eintr-safety.dpatch (LP: #87292): + - Create and use wrappers around read(), write(), and os.waitpid() in the + subprocess module which retry the operation on an EINTR (which happens + if e. g. an alarm was raised while the system call was in progress). + It is incredibly hard and inconvenient to sensibly handle this in + applications, so let's fix this at the right level. + - Patch based on original proposal of Peter <85>strand + in http://python.org/sf/1068268. + - Add two test cases. + - Change the interpreter to build and install python extensions + built with the python-dbg interpreter with a different name into + the same path (by appending `_d' to the extension name). The debug build + of the interpreter tries to first load a foo_d.so or foomodule_d.so + extension, then tries again with the normal name. + - When trying to import the profile and pstats modules, don't + exit, add a hint to the exception pointing to the python-profiler + package, don't exit. + - Keep the module version in the .egg-info name, only remove the + python version. + - python2.5-dbg: Install Misc/SpecialBuilds.txt, document the + debug changes in README.debug. + * Update to 20070425, taken from the 2.5 release branch. + + -- Matthias Klose Wed, 25 Apr 2007 22:12:50 +0200 + +python2.5 (2.5-6) unstable; urgency=medium + + * webbrowser.py: Recognize other browsers: www-browser, x-www-browser, + iceweasel, iceape. + * Move pyconfig.h from the python2.5-dev into the python2.5 package; + required by builds for pure python modules without having python2.5-dev + installed (matching the functionality in python2.4). + * Move the unicodedata module into python2.5-minimal; allows byte compilation + of UTF8 encoded files. 
+ * Do not install anymore outdated debhelper sample scripts. + * Install Misc/SpecialBuilds.txt as python2.5-dbg document. + + -- Matthias Klose Wed, 21 Feb 2007 01:17:12 +0100 + +python2.5 (2.5-5) unstable; urgency=high + + * Do not run the python benchmark on m68k. Timer problems. + Fixes FTBFS on m68k. + * Update to 20061209, taken from the 2.5 release branch. + - Fixes building the library reference in info format. + + -- Matthias Klose Sat, 9 Dec 2006 13:40:48 +0100 + +python2.5 (2.5-4) unstable; urgency=medium + + * Update to 20061203, taken from the 2.5 release branch. + - Fixes build failures on knetfreebsd and the hurd. Closes: #397000. + * Clarify README about distutils. Closes: #396394. + * Move python2.5-config to python2.5-dev. Closes: #401451. + * Cleanup build-conflicts. Addresses: #394512. + + -- Matthias Klose Sun, 3 Dec 2006 18:22:49 +0100 + +python2.5 (2.5-3.1) unstable; urgency=low + + * Non-maintainer upload. + * python2.5-minimal depends on python-minimal (>= 2.4.4-1) because it's the + first version which lists python2.5 as an unsupported runtime (ie a + runtime that is available but for which modules are not auto-compiled). + And being listed there is required for python-central to accept the + installation of python2.5-minimal. Closes: #397006 + + -- Raphael Hertzog Wed, 22 Nov 2006 15:41:06 +0100 + +python2.5 (2.5-3) unstable; urgency=medium + + * Update to 20061029 (2.4.4 was released on 20061019), taken from + the 2.5 release branch. We do not want to have regressions in + 2.5 compared to the 2.4.4 release. + * Don't run pybench on m68k, fails in the calibration loop. Closes: #391030. + * Run the installation/removal hooks. Closes: #383292, #391036. + + -- Matthias Klose Sun, 29 Oct 2006 11:35:19 +0100 + +python2.5 (2.5-2) unstable; urgency=medium + + * Update to 20061003, taken from the 2.5 release branch. + * On arm and m68k, don't run the pybench in debug mode. + * Fix building the source within exec_prefix (Alexander Wirt). + Closes: #385336. + + -- Matthias Klose Tue, 3 Oct 2006 10:08:36 +0200 + +python2.5 (2.5-1) unstable; urgency=low + + * Python 2.5 release. + * Update to 20060926, taken from the 2.5 release branch. + * Run the Python benchmark during the build, compare the results + of the static and shared builds. + * Fix invalid html in python2.5.devhelp.gz. + * Add a python2.5 console entry to the menu (hidden by default). + * python2.5: Suggest python-profiler. + + -- Matthias Klose Tue, 26 Sep 2006 02:36:11 +0200 + +python2.5 (2.5~c1-1) unstable; urgency=low + + * Python 2.5 release candidate 1. + * Update to trunk 20060818. + + -- Matthias Klose Sat, 19 Aug 2006 19:21:05 +0200 + +python2.5 (2.5~b3-1) unstable; urgency=low + + * Build the _ctypes module for m68k-linux. + + -- Matthias Klose Fri, 11 Aug 2006 18:19:19 +0000 + +python2.5 (2.5~b3-0ubuntu1) edgy; urgency=low + + * Python 2.5 beta3 release. + * Update to trunk 20060811. + * Rebuild the documentation. + * Fix value of sys.exec_prefix in the debug build. + * Do not build the library reference in info format; fails to build. + * Link the interpreter against the shared runtime library. With + gcc-4.1 the difference in the pystones benchmark dropped from about + 12% to about 6%. + * Install the statically linked version of the interpreter as + python2.5-static for now. + * Link the shared libpython with -O1. 
+ + -- Matthias Klose Thu, 10 Aug 2006 14:04:48 +0000 + +python2.5 (2.4.3+2.5b2-3) unstable; urgency=low + + * Disable the testsuite on s390; don't care about "minimally configured" + buildd's. + + -- Matthias Klose Sun, 23 Jul 2006 11:45:03 +0200 + +python2.5 (2.4.3+2.5b2-2) unstable; urgency=low + + * Update to trunk 20060722. + * Merge idle-lib from idle-python2.5 into python2.5. + * Merge lib-tk from python-tk into python2.5. + * Tkinter.py: Suggest installation of python-tk package on failed + import of the _tkinter extension. + * Don't run the testsuite for the debug build on alpha. + * Don't run the test_compiler test on m68k. Just takes too long. + * Disable building ctypes on m68k (requires support for closures). + + -- Matthias Klose Sat, 22 Jul 2006 22:26:42 +0200 + +python2.5 (2.4.3+2.5b2-1) unstable; urgency=low + + * Python 2.5 beta2 release. + * Update to trunk 20060716. + * When built on a buildd, do not run the following test which try to + access the network: test_codecmaps_cn, test_codecmaps_hk, test_codecmaps_jp, + test_codecmaps_kr, test_codecmaps_tw, test_normalization. + * When built on a buildd, do not run tests requiring missing write permissions: + test_ossaudiodev. + + -- Matthias Klose Sun, 16 Jul 2006 02:53:50 +0000 + +python2.5 (2.4.3+2.5b2-0ubuntu1) edgy; urgency=low + + * Python 2.5 beta2 release. + + -- Matthias Klose Thu, 13 Jul 2006 17:16:52 +0000 + +python2.5 (2.4.3+2.5b1-1ubuntu2) edgy; urgency=low + + * Fix python-dev dependencies. + * Update to trunk 20060709. + + -- Matthias Klose Sun, 9 Jul 2006 18:50:32 +0200 + +python2.5 (2.4.3+2.5b1-1ubuntu1) edgy; urgency=low + + * Python 2.5 beta1 release. + * Update to trunk 20060623. + * Merge changes from the python2.4 packages. + * python2.5-minimal: Add _struct. + + -- Matthias Klose Fri, 23 Jun 2006 16:04:46 +0200 + +python2.5 (2.4.3+2.5a1-1) experimental; urgency=low + + * Update to trunk 20060409. + * Run testsuite for debug build as well. + * Build-depend on gcc-4.1. + + -- Matthias Klose Sun, 9 Apr 2006 22:27:05 +0200 + +python2.5 (2.4.3+2.5a1-0ubuntu1) dapper; urgency=low + + * Python 2.5 alpha1 release. + * Drop integrated patches. + * Add build dependencies on libsqlite3-dev and libffi4-dev. + * Add (build-)dependency on mime-support, libgpmg1 (test suite). + * Build using the system FFI. + * python2.5 provides python2.5-ctypes and python2.5-pysqlite2, + python2.5-elementtree. + * Move hashlib.py to python-minimal. + * Lib/hotshot/pstats.py: Error out on missing profile/pstats modules. + + -- Matthias Klose Wed, 5 Apr 2006 14:56:15 +0200 + +python2.4 (2.4.3-8ubuntu1) edgy; urgency=low + + * Resynchronize with Debian unstable. Remaining changes: + - Apply langpack-gettext patch. + - diff.gz contains pregenerated html and info docs. + - Build the -doc package from this source. + + -- Matthias Klose Thu, 22 Jun 2006 18:39:57 +0200 + +python2.4 (2.4.3-8) unstable; urgency=low + + * Remove python2.4's dependency on python-central. On installation of + the runtime, call hooks /usr/share/python/runtime.d/*.rtinstall. + On removal, call hooks /usr/share/python/runtime.d/*.rtremove. + Addresses: #372658. + * Call the rtinstall hooks only, if it's a new installation, or the first + installation using the hooks. Adresses: #373677. + + -- Matthias Klose Sun, 18 Jun 2006 00:56:13 +0200 + +python2.4 (2.4.3-7) unstable; urgency=medium + + * Reupload, depend on python-central (>= 0.4.15). + * Add build-conflict on python-xml. 
+ + -- Matthias Klose Wed, 14 Jun 2006 18:56:57 +0200 + +python2.4 (2.4.3-6) medium; urgency=low + + * idle-python2.4: Remove the old postinst and prerm scripts. + * Name the runtime correctly in python2.4-minimal's installation + scripts. + + -- Matthias Klose Mon, 12 Jun 2006 17:39:56 +0000 + +python2.4 (2.4.3-5) unstable; urgency=low + + * python2.4-prerm: Handle the case, when python-central is not installed. + * idle-python2.4: Depend on python-tk instead of python2.4-tk. + + -- Matthias Klose Fri, 9 Jun 2006 05:17:17 +0200 + +python2.4 (2.4.3-4) unstable; urgency=low + + * SVN update up to 2006-06-07 + * Use python-central. + * Don't build the -tk and -gdbm packages from this source; now built + from the python-stdlib-extensions source. + * Remove leftover build dependency on libgmp3-dev. + * Do not build-depend on libbluetooth1-dev and libgpmg1-dev on + hurd-i386, kfreebsd-i386, kfreebsd-amd64. Closes: #365830. + * Do not run the test_tcl test; hangs for unknown reasons on at least + the following buildds: vivaldi(m68k), goedel (alpha), mayer (mipsel). + And no virtual package to file bug reports for the buildds ... + Closes: #364419. + * Move the Makefile from python2.4-dev to python2.4. Closes: #366473. + * Fix typo in pdb(1). Closes: #365772. + * New autoconf likes the mandir in /usr/share instead of /usr; work + with both locations. Closes: #367618. + + -- Matthias Klose Wed, 7 Jun 2006 21:37:20 +0200 + +python2.4 (2.4.3-3) unstable; urgency=low + + * SVN update up to 2006-04-21 + * Update locale aliases from /usr/share/X11/locale/locale.alias. + * Start idle with option -n from the desktop menu, so that the program + can be started in parallel. + * Testsuite related changes only: + - Add build dependencies mime-support, libgpmg1 (needed by test cases). + - Run the testsuite with bsddb, audio and curses resources enabled. + - Re-run the failed tests in verbose mode. + - Run the test suite for the debug build as well. + - Build depend on netbase, needed by test_socketmodule. + - Build depend on libgpmg1, needed by test_curses. + - On the buildds do not run the tests needing the network resource. + * Update python logo. + * Check for the availability of the profile and pstats modules when + importing hotshot.pstats. Closes: #334067. + * Don't build the -doc package from the python2.4 source. + * Set OPT in the installed Makefile to -O2. + + -- Matthias Klose Fri, 21 Apr 2006 19:58:43 +0200 + +python2.4 (2.4.3-2) unstable; urgency=low + + * Add (build-)dependency on mime-support. + + -- Matthias Klose Tue, 4 Apr 2006 22:21:41 +0200 + +python2.4 (2.4.3-1) unstable; urgency=low + + * Python 2.4.3 release. + + -- Matthias Klose Thu, 30 Mar 2006 23:42:37 +0200 + +python2.4 (2.4.3-0ubuntu1) dapper; urgency=low + + * Python 2.4.3 release. + - Fixed a bug that the gb18030 codec raises RuntimeError on encoding + surrogate pair area on UCS4 build. Ubuntu: #29289. + + -- Matthias Klose Thu, 30 Mar 2006 10:57:32 +0200 + +python2.4 (2.4.2+2.4.3c1-0ubuntu1) dapper; urgency=low + + * SVN update up to 2006-03-25 (2.4.3 candidate 1). + - Regenerate the documentation. + + -- Matthias Klose Mon, 27 Mar 2006 12:03:05 +0000 + +python2.4 (2.4.2-1ubuntu3) dapper; urgency=low + + * SVN update up to 2006-03-04 + - Regenerate the documentation. + - map.mmap(-1, size, ...) can return anonymous memory again on Unix. + Ubuntu #26201. + * Build-depend on libncursesw5-dev, ncursesw5 is preferred for linking. + Provides UTF-8 compliant curses bindings. 
+ * Fix difflib where certain patterns of differences were making difflib + touch the recursion limit. + + -- Matthias Klose Sat, 4 Mar 2006 21:38:24 +0000 + +python2.4 (2.4.2-1ubuntu2) dapper; urgency=low + + * SVN update up to 2006-01-17 + - pwd is now a builtin module, remove it from python-minimal. + - Regenerate the documentation. + * python2.4-tk: Suggest tix instead of tix8.1. + * Move config/Makefile from the -dev package into the runtime package + to be able to use the bdist_wininst distutils command. Closes: #348335. + + -- Matthias Klose Tue, 17 Jan 2006 11:02:24 +0000 + +python2.4 (2.4.2-1ubuntu1) dapper; urgency=low + + * Temporarily remove build dependency on lsb-release. + + -- Matthias Klose Sun, 20 Nov 2005 17:40:18 +0100 + +python2.4 (2.4.2-1build1) dapper; urgency=low + + * Rebuild (openssl-0.9.8). + + -- Matthias Klose Sun, 20 Nov 2005 15:27:24 +0000 + +python2.4 (2.4.2-1) unstable; urgency=low + + * Python 2.4.2 release. + + -- Matthias Klose Thu, 29 Sep 2005 01:49:28 +0200 + +python2.4 (2.4.1+2.4.2rc1-1) unstable; urgency=low + + * Python 2.4.2 release candidate 1. + * Fix "Fatal Python error" from cStringIO's writelines. + Patch by Andrew Bennetts. + + -- Matthias Klose Thu, 22 Sep 2005 10:33:22 +0200 + +python2.4 (2.4.1-5) unstable; urgency=low + + * CVS update up to 2005-09-14 + - Regenerate the html and info docs. + * Add some more locale aliases. + * Fix substitution pf python version in README.python2.4-minimal. + Closes: #327487. + * On m68k, build using -O2 (closes: #326903). + * On Debian, don't configure --with-fpectl, which stopped working with + glibc-2.3.5. + + -- Matthias Klose Wed, 14 Sep 2005 17:32:56 +0200 + +python2.4 (2.4.1-4) unstable; urgency=low + + * CVS update up to 2005-09-04 + - teTeX 3.0 related fixes (closes: #322407). + - Regenerate the html and info docs. + * Add entry for IDLE in the Gnome menus. + * Don't build-depend on libbluetooth-dev on the Hurd (closes: #307037). + * Reenable the cthreads patch for the Hurd (closes: #307052). + + -- Matthias Klose Sun, 4 Sep 2005 18:31:42 +0200 + +python2.4 (2.4.1-3) unstable; urgency=low + + * Synchronise with Ubuntu: + - Build a python2.4-minimal package. + + -- Matthias Klose Tue, 12 Jul 2005 00:23:10 +0000 + +python2.4 (2.4.1-2ubuntu3) breezy; urgency=low + + * CVS update up to 2005-07-07 + * Regenerate the documentation. + + -- Matthias Klose Thu, 7 Jul 2005 09:21:28 +0200 + +python2.4 (2.4.1-2ubuntu2) breezy; urgency=low + + * CVS update up to 2005-06-15 + * Regenerate the documentation. + * Synchronize with Debian. Ubuntu 10485. + * idle-python2.4 enhances python2.4. Ubuntu 11562. + * README.Debian: Fix reference to the doc directory (closes: #311677). + + -- Matthias Klose Wed, 15 Jun 2005 08:56:57 +0200 + +python2.4 (2.4.1-2ubuntu1) breezy; urgency=low + + * Update build dependencies: + db4.2-dev -> db4.3-dev, + libreadline4-dev -> libreadline5-dev. + * python2.4-dev: Add missing templates to generate HTML docs. Ubuntu 11531. + + -- Matthias Klose Sun, 29 May 2005 00:01:05 +0200 + +python2.4 (2.4.1-2) unstable; urgency=low + + * Add the debug symbols for the python2.4, python2.4-gdbm + and python2.4-tk packages to the python2.4-dbg package. + * Add gdbinit example to doc directory. + + -- Matthias Klose Thu, 5 May 2005 11:12:32 +0200 + +python2.4 (2.4.1-1ubuntu2) breezy; urgency=low + + * Add the debug symbols for the python2.4, python2.4-minimal, python2.4-gdbm + and python2.4-tk packages to the python2.4-dbg package. Ubuntu 10261, + * Add gdbinit example to doc directory. 
+ * For os.utime, use utimes(2), correctly working with glibc-2.3.5. + Ubuntu 10294. + + -- Matthias Klose Thu, 5 May 2005 09:06:07 +0200 + +python2.4 (2.4.1-1ubuntu1) breezy; urgency=low + + * Reupload as 2.4.1-1ubuntu1. + + -- Matthias Klose Thu, 14 Apr 2005 10:46:32 +0200 + +python2.4 (2.4.1-1) unstable; urgency=low + + * Python 2.4.1 release. + * Fix noise in python-doc installation/removal. + * New Python section for the info docs. + + -- Matthias Klose Wed, 30 Mar 2005 19:42:03 +0200 + +python2.4 (2.4.1-0) hoary; urgency=low + + * Python 2.4.1 release. + * Fix noise in python-doc installation/removal. + * New Python section for the info docs. + + -- Matthias Klose Wed, 30 Mar 2005 16:35:34 +0200 + +python2.4 (2.4+2.4.1rc2-2) unstable; urgency=low + + * Add the valgrind support file to /etc/python2.4 + * Build the -dbg package with -DPy_USING_MEMORY_DEBUGGER. + * Lib/locale.py: + - correctly parse LANGUAGE as a colon separated list of languages. + - prefer LC_ALL, LC_CTYPE and LANG over LANGUAGE to get the correct + encoding. + - Don't map 'utf8', 'utf-8' to 'utf', which is not a known encoding + for glibc. + * Fix two typos in python(1). Addresses: #300124. + + -- Matthias Klose Sat, 19 Mar 2005 21:50:14 +0100 + +python2.4 (2.4+2.4.1rc2-1) unstable; urgency=low + + * Python 2.4.1 release candidate 2. + * Build-depend on libbluetooth1-dev. + + -- Matthias Klose Sat, 19 Mar 2005 00:57:14 +0100 + +python2.4 (2.4dfsg-2) unstable; urgency=low + + * CVS update up to 2005-03-03 + + -- Matthias Klose Thu, 3 Mar 2005 22:22:16 +0100 + +python2.4 (2.4dfsg-1ubuntu4) hoary; urgency=medium + + * Move exception finalisation later in the shutdown process - this + fixes the crash seen in bug #1165761, taken from CVS. + * codecs.StreamReader: Reset codec when seeking. Ubuntu #6972. + * Apply fix for SF1124295, fixing an obscure bit of Zope's security machinery. + * distutils: Don't add standard library dirs to library_dirs + and runtime_library_dirs. On amd64, runtime paths pointing to /usr/lib64 + aren't recognized by dpkg-shlibdeps, and the packages containing these + libraries aren't added to ${shlibs:Depends}. + * Lib/locale.py: + - correctly parse LANGUAGE as a colon separated list of languages. + - prefer LC_ALL, LC_CTYPE and LANG over LANGUAGE to get the correct + encoding. + - Don't map 'utf8', 'utf-8' to 'utf', which is not a known encoding + for glibc. + * os.py: Avoid using items() in environ.update(). Fixes #1124513. + * Python/pythonrun.c: + * Build depend on locales, generate the locales needed for the + testsuite. + * Add build dependency on libbluetooth1-dev, adding some bluetooth + functionality to the socket module. + * Lib/test/test_sundry.py: Don't fail on import of profile & pstats, + which are separated out to the python-profiler package. + * Fix typos in manpage. + + -- Matthias Klose Tue, 29 Mar 2005 13:35:53 +0200 + + +python2.4 (2.4dfsg-1ubuntu3) hoary; urgency=low + + * debian/patches/langpack-gettext.dpatch: + - langpack support for python-gettext added + + -- Michael Vogt Tue, 1 Mar 2005 13:13:36 +0100 + +python2.4 (2.4dfsg-1ubuntu2) hoary; urgency=low + + * Revert 'essential' status on python2.4-minimal. This status on + on python-minimal is sufficient (Ubuntu #6392). + + -- Matthias Klose Wed, 9 Feb 2005 23:09:42 +0100 + +python2.4 (2.4dfsg-1ubuntu1) hoary; urgency=low + + * Resyncronise with Debian. + * Mark the python2.4-minimal package as 'essential'. 
+
+ -- Matthias Klose Wed, 9 Feb 2005 13:31:09 +0100
+
+python2.4 (2.4dfsg-1) unstable; urgency=medium
+
+  * Add licenses and acknowledgements for incorporated software in the
+    debian/copyright file (addresses: #293932).
+  * Replace md5 implementation with one having a DFSG conforming license.
+  * Remove the profile.py and pstats.py modules from the source package,
+    not having a DFSG conforming license. The modules can be found in
+    the python2.x-profile package in the non-free section.
+    Addresses: #293932.
+  * Add missing Norwegian locales (Tollef Fog Heen).
+  * CVS updates of the release24-maint branch up to 2005-02-08 (date of
+    the Python 2.3.5 release).
+
+ -- Matthias Klose Tue, 8 Feb 2005 19:13:10 +0100
+
+python2.4 (2.4-7ubuntu1) hoary; urgency=low
+
+  * Fix the name of the python-dbg man page.
+  * Resynchronise with Debian.
+  * Move more modules to -minimal (new code in copy.py requires these):
+    dis, inspect, opcode, token, tokenize.
+
+ -- Matthias Klose Tue, 8 Feb 2005 19:13:10 +0100
+
+python2.4 (2.4-7) unstable; urgency=medium
+
+  * Add licenses and acknowledgements for incorporated software in the
+    debian/copyright file (addresses: #293932).
+  * Replace md5 implementation with one having a DFSG conforming license.
+  * Add missing Norwegian locales (Tollef Fog Heen).
+  * CVS updates of the release24-maint branch up to 2005-02-08 (date of
+    the Python 2.3.5 release).
+
+ -- Matthias Klose Tue, 8 Feb 2005 19:13:10 +0100
+
+python2.4 (2.4-6) unstable; urgency=low
+
+  * Build a python2.4-dbg package using --with-pydebug. Add a debug
+    directory /lib-dynload/debug to sys.path instead of
+    /lib-dynload and install the extension modules of the
+    debug build in this directory.
+    Change the module load path to load extension modules from other
+    site-packages/debug directories (for further details see the
+    README in the python2.4-dbg package). Closes: #5415.
+  * Apply the pydebug-path patch. The package was already built in -5.
+
+ -- Matthias Klose Fri, 4 Feb 2005 22:15:13 +0100
+
+python2.4 (2.4-5) unstable; urgency=high
+
+  * Fix a flaw in SimpleXMLRPCServer that can affect any XML-RPC servers.
+    This affects any programs that have been written to allow remote
+    untrusted users to do unrestricted traversal and can allow them to
+    access or change function internals using the im_* and func_* attributes.
+    References: CAN-2005-0089.
+  * CVS updates of the release24-maint branch up to 2005-02-04.
+
+ -- Matthias Klose Fri, 4 Feb 2005 08:12:10 +0100
+
+python2.4 (2.4-4) unstable; urgency=medium
+
+  * Update debian/copyright to the 2.4 license text (closes: #290898).
+  * Remove /usr/bin/smtpd.py (closes: #291049).
+
+ -- Matthias Klose Mon, 17 Jan 2005 23:54:37 +0100
+
+python2.4 (2.4-3ubuntu6) hoary; urgency=low
+
+  * Use old-style dpatches instead of dpatch-run.
+
+ -- Tollef Fog Heen Mon, 7 Feb 2005 15:58:05 +0100
+
+python2.4 (2.4-3ubuntu5) hoary; urgency=low
+
+  * Actually apply the patch as well (add to list of patches in
+    debian/rules)
+
+ -- Tollef Fog Heen Sun, 6 Feb 2005 15:12:58 +0100
+
+python2.4 (2.4-3ubuntu4) hoary; urgency=low
+
+  * Add nb_NO and nn_NO locales to Lib/locale.py
+
+ -- Tollef Fog Heen Sun, 6 Feb 2005 14:33:05 +0100
+
+python2.4 (2.4-3ubuntu3) hoary; urgency=low
+
+  * Fix a flaw in SimpleXMLRPCServer that can affect any XML-RPC servers.
+    This affects any programs that have been written to allow remote
+    untrusted users to do unrestricted traversal and can allow them to
+    access or change function internals using the im_* and func_* attributes.
+ References: CAN-2005-0089. + + -- Matthias Klose Wed, 2 Feb 2005 09:08:20 +0000 + +python2.4 (2.4-3ubuntu2) hoary; urgency=low + + * Build a python2.4-dbg package using --with-pydebug. Add a debug + directory /lib-dynload/debug to sys.path instead of + /lib-dynload und install the extension modules of the + debug build in this directory. + Change the module load path to load extension modules from other + site-packages/debug directories (for further details see the + README in the python2.4-dbg package). Closes: #5415. + * Update debian/copyright to the 2.4 license text (closes: #290898). + * Add operator and copy to the -minimal package. + + -- Matthias Klose Mon, 17 Jan 2005 23:19:47 +0100 + +python2.4 (2.4-3ubuntu1) hoary; urgency=low + + * Resynchronise with Debian. + * python2.4: Depend on the very same version of python2.4-minimal. + * Docment, that time.strptime currently cannot be used, if the + python-minimal package is installed without the python package. + + -- Matthias Klose Sun, 9 Jan 2005 19:35:48 +0100 + +python2.4 (2.4-3) unstable; urgency=medium + + * Build the fpectl module. + * Updated to CVS release24-maint 20050107. + + -- Matthias Klose Sat, 8 Jan 2005 19:05:21 +0100 + +python2.4 (2.4-2ubuntu5) hoary; urgency=low + + * Updated to CVS release24-maint 20050102. + * python-minimal: + - os.py: Use dict instead of UserDict, remove UserDict from -minimal. + - add pickle, threading, needed for subprocess module. + - optparse.py: conditionally import gettext, if not available, + define _ as the identity function. Patch taken from the trunk. + Avoids import of _locale, locale, gettext, copy, repr, itertools, + collections, token, tokenize. + - Add a build check to make sure that the minimal module list is + closed under dependency. + * Fix lintian warnings. + + -- Matthias Klose Sun, 2 Jan 2005 22:00:14 +0100 + +python2.4 (2.4-2ubuntu4) hoary; urgency=low + + * Add UserDict.py to the -minimal package, since os.py needs it. + + -- Colin Watson Thu, 30 Dec 2004 20:41:28 +0000 + +python2.4 (2.4-2ubuntu3) hoary; urgency=low + + * Add os.py and traceback.py to the -minimal package, get the list + of modules from the README. + + -- Matthias Klose Mon, 27 Dec 2004 08:20:45 +0100 + +python2.4 (2.4-2ubuntu2) hoary; urgency=low + + * Add compileall.py and py_compile.py to the -minimal package, not + just to the README ... + + -- Matthias Klose Sat, 25 Dec 2004 22:24:56 +0100 + +python2.4 (2.4-2ubuntu1) hoary; urgency=low + + * Separate the interpreter and a minimal subset of modules into + a python2.4-minimal package. See the README.Debian.gz in this + package. + * Move site.py to python2.4-minimal as well. + * Add documentation files for devhelp. + + -- Matthias Klose Sun, 19 Dec 2004 22:47:32 +0100 + +python2.4 (2.4-2) unstable; urgency=medium + + * Updated patch for #283108. Thanks to Jim Meyering. + + -- Matthias Klose Fri, 3 Dec 2004 17:00:16 +0100 + +python2.4 (2.4-1) unstable; urgency=low + + * Final 2.4 release. + * Flush stdout/stderr if closed (SF #1074011). + + -- Matthias Klose Wed, 1 Dec 2004 07:54:34 +0100 + +python2.4 (2.3.97-2) unstable; urgency=low + + * Don't run test_tcl, hanging on the buildds. + + -- Matthias Klose Fri, 19 Nov 2004 23:48:42 +0100 + +python2.4 (2.3.97-1) unstable; urgency=low + + * Python 2.4 Release Candidate 1. + + -- Matthias Klose Fri, 19 Nov 2004 21:27:02 +0100 + +python2.4 (2.3.96-1) experimental; urgency=low + + * Updated to CVS release24-maint 20041113. + * Build the docs in info format again. 
+ + -- Matthias Klose Sat, 13 Nov 2004 21:21:10 +0100 + +python2.4 (2.3.95-2) experimental; urgency=low + + * Move distutils package from the python2.4-dev into the python2.4 + package. + + -- Matthias Klose Thu, 11 Nov 2004 22:56:14 +0100 + +python2.4 (2.3.95-1) experimental; urgency=low + + * Python 2.4 beta2 release. + + -- Matthias Klose Thu, 4 Nov 2004 23:43:47 +0100 + +python2.4 (2.3.94-1) experimental; urgency=low + + * Python 2.4 beta1 release. + + -- Matthias Klose Sat, 16 Oct 2004 08:33:57 +0200 + +python2.4 (2.3.93-1) experimental; urgency=low + + * Python 2.4 alpha3 release. + + -- Matthias Klose Fri, 3 Sep 2004 21:53:47 +0200 + +python2.4 (2.3.92-1) experimental; urgency=low + + * Python 2.4 alpha2 release. + + -- Matthias Klose Thu, 5 Aug 2004 23:53:18 +0200 + +python2.4 (2.3.91-1) experimental; urgency=low + + * Python 2.4 alpha1 release. + Highlights: http://www.python.org/2.4/highlights.html + + -- Matthias Klose Fri, 9 Jul 2004 17:38:54 +0200 + +python2.4 (2.3.90-1) experimental; urgency=low + + * Package HEAD branch (pre alpha ..). + + -- Matthias Klose Mon, 14 Jun 2004 23:19:57 +0200 + +python2.3 (2.3.4-1) unstable; urgency=medium + + * Final Python 2.3.4 Release. + * In the API docs, fix signature of PyModule_AddIntConstant (closes: #250826). + * locale.getdefaultlocale: don't fail with empty environment variables. + Closes: #249816. + * Include distutils/command/wininst.exe in -dev package (closes: #249006). + * Disable cthreads on the Hurd (Michael Banck). Closes: #247211. + * Add a note to pygettext(1), that this program is deprecated in favour + of xgettext, which now includes support for Python as well. + Closes: #246332. + + -- Matthias Klose Fri, 28 May 2004 22:59:42 +0200 + +python2.3 (2.3.3.91-1) unstable; urgency=low + + * Python 2.3.4 Release Candidate 1. + * Do not use the default namespace for attributes. Patch taken from the + 2.3 maintenance branch. + The xmllib module is obsolete. Use xml.sax instead. + * http://python.org/sf/945642 - fix nonblocking i/o with ssl socket. + + -- Matthias Klose Thu, 13 May 2004 21:24:52 +0200 + +python2.3 (2.3.3-7) unstable; urgency=low + + * Add a workaround for GNU libc nl_langinfo()'s returning NULL. + Closes: #239237. + Patch taken from 2.3 maintenance branch. + * threading.py: Remove calls to currentThread() in _Condition methods that + were side-effect. Side-effects were deemed unnecessary and were causing + problems at shutdown time when threads were catching exceptions at start + time and then triggering exceptions trying to call currentThread() after + gc'ed. Masked the initial exception which was deemed bad. + Closes: #195812. + * Properly support normalization of empty unicode strings. Closes: #239986. + Patch taken from 2.3 maintenance branch. + * README.maintainers: Add section where to find the documentation tools. + * Fix crash in pyexpat module (closes: #229281). + * For the Hurd, set the interpreters recursion limit to 930. + * Do not try to byte-compile the test files on installation; this + currently breaks the Hurd install. + + -- Matthias Klose Sat, 1 May 2004 07:50:46 +0200 + +python2.3 (2.3.3-6) unstable; urgency=low + + * Don't build the unversioned python{,-*} packages anymore. Now + built from the python-defaults package. + * Update to the proposed python-policy: byte-compile using -E. + * Remove python-elisp's dependency on emacs20 (closes: #232785). + * Don't build python-elisp from the python2.3 source anymore, + get it from python-mode.sf.net as a separate source package. 
+ * python2.3-dev suggests libc-dev (closes: #231091). + * get LDSHARED and CCSHARED (like, CC, CXX, CPP, CFLAGS) from + the environment + * Set CXX in installed config/Makefile (closes: #230273). + + -- Matthias Klose Tue, 24 Feb 2004 07:07:51 +0100 + +python2.3 (2.3.3-5) unstable; urgency=low + + * Build-depend on libdb4.2-dev, instead of libdb4.1-dev. According + to the docs the file format is compatible. + + -- Matthias Klose Mon, 12 Jan 2004 10:37:45 +0100 + +python2.3 (2.3.3-4) unstable; urgency=low + + * Fix broken _bsddb module. setup.py picked up the wrong library. + + -- Matthias Klose Sun, 4 Jan 2004 11:30:00 +0100 + +python2.3 (2.3.3-3) unstable; urgency=low + + * Fix typo in patch (closes: #224797, #226064). + + -- Matthias Klose Sun, 4 Jan 2004 09:23:21 +0100 + +python2.3 (2.3.3-2) unstable; urgency=medium + + * Lib/email/Charset: use locale unaware function to lower case of locale + name (closes: #224797). + * Update python-mode to version from python-mode.sf.net. Fixes highlighting + problems (closes: #223520). + * Backport from mainline: Add IPV6_ socket options from RFCs 3493 and 3542. + + -- Matthias Klose Fri, 2 Jan 2004 14:03:26 +0100 + +python2.3 (2.3.3-1) unstable; urgency=low + + * New upstream release. + * Copy the templates, tools and scripts from the Doc dir in the source + to /usr/share/lib/python2.3/doc in the python2.3-dev package. Needed + for packages building documentation like python does (closes: #207337). + + -- Matthias Klose Fri, 19 Dec 2003 10:57:39 +0100 + +python2.3 (2.3.2.91-1) unstable; urgency=low + + * New upstream version (2.3.3 release candidate). + * Update python-mode.el (closes: #158811, #159630). + Closing unreproducible report (closes: #159628). + + -- Matthias Klose Sat, 6 Dec 2003 14:41:14 +0100 + +python2.3 (2.3.2-7) unstable; urgency=low + + * Put the conflict in the correct direction. python2.3 (2.3.2-6) doesn't + conflict with python (<= 2.3.2-5) but python (2.3.2-6) conflicts with + python2.3 (<= 2.3.2-5) (thanks to Brian May). Really closes #221791. + + -- Matthias Klose Fri, 21 Nov 2003 00:20:02 +0100 + +python2.3 (2.3.2-6) unstable; urgency=low + + * Add conflicts with older python{,2.3} packages to fix overwrite + errors (closes: #221791). + + -- Matthias Klose Thu, 20 Nov 2003 07:24:36 +0100 + +python2.3 (2.3.2-5) unstable; urgency=low + + * Updated to CVS release23-maint 20031119. + * Re-upgrade the dependency of python2.3 on python (>= 2.3) to + a dependency (closes: #221523). + + -- Matthias Klose Wed, 19 Nov 2003 00:30:27 +0100 + +python2.3 (2.3.2-4) unstable; urgency=low + + * Don't build-depend on latex2html (moved to non-free), but keep + the prebuilt docs in debian/patches (closes: #221347). + * Fix typos in the library reference (closes: #220510, #220954). + * Fix typo in python-elisp's autoloading code (closes: #220308). + * Update proposed python policy: private modules can be installed + into /usr/lib/ (arch dependent) and into /usr/share/ + (arch independent). + + -- Matthias Klose Tue, 18 Nov 2003 00:41:39 +0100 + +python2.3 (2.3.2-3) unstable; urgency=low + + * Downgrade the dependency of python2.3 on python (>= 2.3) to + a recommendation. + * Fix path to interpreter in binfmt file. + * Fix segfault in unicodedata module (closes: #218697). + * Adjust python-elisp autoload code (closes: #219821). + + -- Matthias Klose Sun, 9 Nov 2003 19:43:37 +0100 + +python2.3 (2.3.2-2) unstable; urgency=medium + + * Fix broken doc link (closes: #214217). + * Disable wrongly detected large file support for GNU/Hurd. 
+ * Really fix the FTBFS for the binary-indep target (closes: #214303). + + -- Matthias Klose Mon, 6 Oct 2003 07:54:58 +0200 + +python2.3 (2.3.2-1) unstable; urgency=low + + * New upstream version. + * Fix a FTBFS for the binary-indep target. + + -- Matthias Klose Sat, 4 Oct 2003 10:20:15 +0200 + +python2.3 (2.3.1-3) unstable; urgency=low + + * Fix names of codec packages in recommends. + * On alpha compile using -mieee (see #212912). + + -- Matthias Klose Sun, 28 Sep 2003 10:48:12 +0200 + +python2.3 (2.3.1-2) unstable; urgency=low + + * Update python policy draft (closes: #128911, #163785). + * Re-add os.fsync function (closes: #212672). + * Let python2.3-doc conflict with older python2.3 versions (closes: #211882). + * Add recommends for pythonX.Y-japanese-codecs, pythonX.Y-iconvcodec, + pythonX.Y-cjkcodecs, pythonX.Y-korean-codecs (closes: #207161). + * Generate binfmt file (closes: #208005). + * Add IPPROTO_IPV6 option to the socketmodule (closes: #206569). + * Bugs reported against python2.2 and fixed in python2.3: + - Crashes in idle (closes: #186887, #200084). + + -- Matthias Klose Sat, 27 Sep 2003 11:21:47 +0200 + +python2.3 (2.3.1-1) unstable; urgency=low + + * New upstream version (bug fix release). + + -- Matthias Klose Wed, 24 Sep 2003 11:27:43 +0200 + +python2.3 (2.3-4) unstable; urgency=high + + * Disable check for utimes function, which is broken in glibc-2.3.2. + Packages using distutils had '1970/01/01-01:00:01' timestamps in files. + * Bugs fixed by making python2.3 the default python version: + - Canvas.scan_dragto() takes a 3rd optional parmeter "gain". + Closes: #158168. + - New command line parsing module (closes: #38628). + - compileall.py allows compiling single files (closes: #139971). + * Bugs reported for 2.2 and fixed in 2.3: + - Idle does save files with ASCII characters (closes: #179313). + - imaplib support for prefix-quoted strings (closes: #150485). + - posixpath includes getctime (closes: #173827). + - pydoc has support for keywords (closes: #186775). + * Bugs reported for 2.1 and fixed in 2.3: + - Fix handling of "#anchor" URLs in urlparse (closes: #147844). + - Fix readline if C stdin is not a tty, even if sys.stdin is. + Closes: #131810. + * Updated to CVS release23-maint 20030810 (fixing memory leaks in + array and socket modules). + * pydoc's usage output uses the basename of the script. + * Don't explicitely remove /etc/python2.3 on purge (closes: #202864). + * python conflicts with python-xmlbase (closes: #204773). + * Add dependency python (>= 2.3) to python2.3, so make sure the + unversioned names can be used. + + -- Matthias Klose Sun, 10 Aug 2003 09:27:52 +0200 + +python2.3 (2.3-3) unstable; urgency=medium + + * Fix shlibs file. + + -- Matthias Klose Fri, 8 Aug 2003 08:45:12 +0200 + +python2.3 (2.3-2) unstable; urgency=medium + + * Make python2.3 the default python version. + + -- Matthias Klose Tue, 5 Aug 2003 22:13:22 +0200 + +python2.3 (2.3-1) unstable; urgency=low + + * Python 2.3 final release. + + -- Matthias Klose Wed, 30 Jul 2003 08:12:28 +0200 + +python2.3 (2.2.107-1rc2) unstable; urgency=medium + + * Python 2.3 release candidate 2. + * Don't compress .txt files referenced by the html docs (closes: #200298). + * Include the email/_compat* files (closes: #200349). + + -- Matthias Klose Fri, 25 Jul 2003 07:08:09 +0200 + +python2.3 (2.2.106-2beta2) unstable; urgency=medium + + * Python 2.3 beta2 release, updated to CVS 20030704. + - Fixes AssertionError in httplib (closed: #192452). 
+ - Fixes uncaught division by zero in difflib.py (closed: #199287). + * Detect presence of setgroups(2) at configure time (closes: #199839). + * Use default gcc on arm as well. + + -- Matthias Klose Sat, 5 Jul 2003 10:21:33 +0200 + +python2.3 (2.2.105-1beta2) unstable; urgency=low + + * Python 2.3 beta2 release. + - Includes merged idle fork. + - Fixed socket.setdefaulttimeout(). Closes: #189380. + - socket.ssl works with _socketobj. Closes: #196082. + * Do not link libtix to the _tkinter module. It's loaded via + 'package require tix' at runtime. python2.3-tkinter now + suggests tix8.1 instead. + * On arm, use gcc-3.2 to build. + * Add -fno-strict-aliasing rules to OPT to avoid warnings + "dereferencing type-punned pointer will break strict-aliasing rules", + when building with gcc-3.3. + + -- Matthias Klose Mon, 30 Jun 2003 00:19:32 +0200 + +python2.3 (2.2.104-1beta1.1) unstable; urgency=low + + * Non-maintainer upload with maintainer consent. + * debian/control (Build-Depends): s/libgdbmg1-dev/libgdbm-dev/. + + -- James Troup Wed, 4 Jun 2003 02:24:27 +0100 + +python2.3 (2.2.104-1beta1) unstable; urgency=low + + * Python 2.3 beta1 release, updated to CVS 20030514. + - build the current documentation. + * Reenable Tix support. + + -- Matthias Klose Wed, 14 May 2003 07:38:57 +0200 + +python2.3 (2.2.103-1beta1) unstable; urgency=low + + * Python 2.3 beta1 release, updated to CVS 20030506. + - updated due to build problems on mips/mipsel. + - keep the 2.3b1 documentation (doc build problems with cvs). + + -- Matthias Klose Wed, 7 May 2003 06:26:39 +0200 + +python2.3 (2.2.102-1beta1) unstable; urgency=low + + * Python 2.3 beta1 release. + + -- Matthias Klose Sat, 3 May 2003 22:45:16 +0200 + +python2.3 (2.2.101-1exp1) unstable; urgency=medium + + * Python 2.3 alpha2 release, updated to CVS 20030321. + * Tkinter: Catch exceptions thrown for undefined substitutions in + events (needed for tk 8.4.2). + + -- Matthias Klose Fri, 21 Mar 2003 21:32:14 +0100 + +python2.3 (2.2.100-1exp1) unstable; urgency=low + + * Python 2.3 alpha2 release, updated to CVS 20030221. + + -- Matthias Klose Fri, 21 Feb 2003 19:37:17 +0100 + +python2.3 (2.2.99-1exp1) unstable; urgency=low + + * Python 2.3 alpha1 release updated to CVS 20030123. + - should fix the testsuite (and package build) failure on alpha. + * Remove build dependency on libexpat1-dev. Merge the python2.3-xmlbase + package into python2.3 (closes: #177739). + + -- Matthias Klose Thu, 23 Jan 2003 22:48:12 +0100 + +python2.3 (2.2.98-1exp1) unstable; urgency=low + + * Python 2.3 alpha1 release updated to CVS 20030117. + * Build using libdb4.1. + + -- Matthias Klose Sat, 18 Jan 2003 00:14:01 +0100 + +python2.3 (2.2.97-1exp1) unstable; urgency=low + + * Python 2.3 alpha1 release updated to CVS 20030109. + * Build-Depend on g++ (>= 3:3.2). + * Python package maintainers: please wait uploading python dependent + packages until python2.2 and python2.1 are compiled using gcc-3.2. + + -- Matthias Klose Thu, 9 Jan 2003 23:56:42 +0100 + +python2.3 (2.2.96-1exp1) unstable; urgency=low + + * Python 2.3 alpha1 release (not exactly the tarball, but taken from + CVS 20030101). + - Includes support for linking with threaded tk8.4 (closes: #172714). + * Install and register whatsnew document (closes: #173859). + * Properly unregister info documentation. + + -- Matthias Klose Wed, 1 Jan 2003 17:38:54 +0100 + +python2.3 (2.2.95-1exp1) unstable; urgency=low + + * Experimental packages from CVS 021212. + - data in unicodedate module is up to date (closes: #171061). 
+ * Fix idle packaging (closes: #170394). + * Configure using unicode UCS-4 (closes: #171062). + This change breaks compatibility with binary modules, but what do you + expect from experimental packages ... Please recompile dependent packages. + * Don't strip binaries for now. + + -- Matthias Klose Thu, 12 Dec 2002 21:42:27 +0100 + +python2.3 (2.2.94-1exp1) unstable; urgency=low + + * Experimental packages from CVS 021120. + * Remove outdated README.dbm. + * Depend on tk8.4. + * python-elisp: Install emacsen install file with mode 644 (closes: #167718). + + -- Matthias Klose Thu, 21 Nov 2002 01:04:51 +0100 + +python2.3 (2.2.93-1exp1) unstable; urgency=medium + + * Experimental packages from CVS 021015. + * Build a static library libpython2.3-pic.a. + * Enable large file support for the Hurd (closes: #164602). + + -- Matthias Klose Tue, 15 Oct 2002 21:06:27 +0200 + +python2.3 (2.2.92-1exp1) unstable; urgency=low + + * Experimental packages from CVS 020922. + * Fix build error on ia64 (closes: #161234). + * Build depend on gcc-3.2-3.2.1-0pre2 to fix build error on arm. + + -- Matthias Klose Sun, 22 Sep 2002 18:30:28 +0200 + +python2.3 (2.2.91-1exp1) unstable; urgency=low + + * Experimental packages from CVS 020906. + * idle-python2.3: Fix conflict (closes: #159267). + * Fix location of python-mode.el (closes: #159564, #159619). + * Use tix8.1. + * Apply fix for distutils/ccompiler problem (closes: #159288). + + -- Matthias Klose Sat, 7 Sep 2002 09:55:07 +0200 + +python2.3 (2.2.90-1exp1) unstable; urgency=low + + * Experimental packages from CVS 020820. + * Don't build python2.3-elisp, but put the latest version into + python-elisp. + + -- Matthias Klose Thu, 22 Aug 2002 21:52:04 +0200 + +python2.2 (2.2.1-6) unstable; urgency=low + + * CVS updates of the release22-maint branch upto 2002-07-23. + * Enable IPv6 support (closes: #152543). + * Add python2.2-tk suggestion for python2.2 (pydoc -g). + * Fix from SF patch #527518: proxy config with user+pass authentication. + * Point pydoc to the correct location of the docs (closes: #147579). + * Remove '*.py[co]' files, when removing the python package, + not when purging (closes: #147130). + * Update to new py2texi.el version (Milan Zamazal). + + -- Matthias Klose Mon, 29 Jul 2002 23:11:32 +0200 + +python2.2 (2.2.1-5) unstable; urgency=low + + * CVS updates of the release22-maint branch upto 2002-05-03. + * Build the info docs (closes: #145653). + + -- Matthias Klose Fri, 3 May 2002 22:35:46 +0200 + +python2.2 (2.2.1-4) unstable; urgency=high + + * Fix indentation errors introduced in last upload (closes: #143809). + + -- Matthias Klose Sun, 21 Apr 2002 01:00:14 +0200 + +python2.2 (2.2.1-3) unstable; urgency=high + + * Add Build-Conflicts: tcl8.0-dev, tk8.0-dev, tcl8.2-dev, tk8.2-dev. + Closes: #143534 (build a working _tkinter module, on machines, where + 8.0's tk.h gets included). + * CVS updates of the release22-maint branch upto 2002-04-20. + + -- Matthias Klose Sat, 20 Apr 2002 09:22:37 +0200 + +python2.2 (2.2.1-2) unstable; urgency=low + + * Forgot to copy the dlmodule patch from the 2.1.3 package. Really + closes: #141681. + + -- Matthias Klose Sat, 13 Apr 2002 01:28:05 +0200 + +python2.2 (2.2.1-1) unstable; urgency=high + + * Final 2.2.1 release. + * According to report #131813, the python interpreter is much faster on some + architectures, when beeing linked statically with the python library (25%). + Gregor and me tested on i386, m68k and alpha, but we could not reproduce + such a speedup (generally between 5% and 10%). 
But we are linking the + python executable now statically ... + * Build info docs from the tex source, merge the python-doc-info + package into the python-doc package. + * Always build the dl module. Failure in case of + sizeof(int)!=sizeof(long)!=sizeof(void*) + is delayed until dl.open is called. Closes: #141681. + + -- Matthias Klose Thu, 11 Apr 2002 00:19:19 +0200 + +python2.2 (2.2.0.92-0) unstable; urgency=low + + * Package CVS sources, omit cvs-updates.dpatch (closes: #140977). + + -- Matthias Klose Wed, 3 Apr 2002 08:20:52 +0200 + +python2.2 (2.2-6) unstable; urgency=medium + + * Update to python-2.2.1 release candidate 2 (final release scheduled + for April 10). + * Enable dl module (closes: #138992). + * Build doc files with python binary from package (closes: #139657). + * Build _tkinter module with BLT and Tix support. + * python2.2-elisp: Conflict with python2-elisp (closes: #138970). + * string.split docs updated in python-2.2.1 (closes: #129272). + + -- Matthias Klose Mon, 1 Apr 2002 13:52:36 +0200 + +python2.2 (2.2-5) unstable; urgency=low + + * CVS updates of the release22-maint branch upto 20020310 (aproaching + the first 2.2.1 release candidate). + * Stolen from HEAD: check argument of locale.nl_langinfo (closes: #137371). + + -- Matthias Klose Fri, 15 Mar 2002 01:05:59 +0100 + +python2.2 (2.2-4) unstable; urgency=medium + + * Include test/{__init__.py,README,pystone.py} in package (closes: #129013). + * Fix python-elisp conflict (closes: #129046). + * Don't compress stylesheets (closes: #133179). + * CVS updates of the release22-maint branch upto 20020310. + + -- Matthias Klose Sun, 10 Mar 2002 23:32:28 +0100 + +python2.2 (2.2-3) unstable; urgency=medium + + * Updates from the CVS python22-maint branch up to 20020107. + webbrowser.py: properly escape url's. + * The Hurd does not have large file support: disabled. + + -- Matthias Klose Mon, 7 Jan 2002 21:55:57 +0100 + +python2.2 (2.2-2) unstable; urgency=medium + + * CVS updates of the release22-maint branch upto 20011229. Fixes: + - Include TCP_CORK flag in plat-linux2 headers (fixes: #84340). + - Update CDROM.py module (fixes: #125785). + * Add missing chunk of the GNU/Hurd patch (therefore urgency medium). + * Send anonymous password when using anonftp (closes: #126814). + + -- Matthias Klose Sat, 29 Dec 2001 20:18:26 +0100 + +python2.2 (2.2-1) unstable; urgency=low + + * New upstream version: 2.2. + * Bugs fixed upstream: + - Docs for os.kill reference the signal module for constants. + - Documentation strings in the tutorial end with a period (closes: #94770). + - Tk: grid_location method moved from Grid to Misc (closes: #98338). + - mhlib.SubMessage.getbodytext takes decode parameter (closes: #31876). + - Strings in modules are locale aware (closes: #51444). + - Printable 8-bit characters in strings are correctly printed + (closes: #64354). + - Dictionary can be updated with abstract mapping object (closes: #46566). + * Make site.py a config files. + + -- Matthias Klose Sat, 22 Dec 2001 00:51:46 +0100 + +python2.2 (2.1.99c1-1) unstable; urgency=low + + * New upstream version: 2.2c1 (release candidate). + * Do not provide python2.2-base anymore. + * Install correct README.Debian for python2.2 package. Include hint + where to find Makefile.pre.in. + * Suggest installation of python-ssl. + * Remove idle config files on purge. + * Remove empty /usr/lib/python2.2 directory on purge. 
+ + -- Matthias Klose Sat, 15 Dec 2001 17:56:27 +0100 + +python2.2 (2.1.99beta2-1) unstable; urgency=high + + * debian/rules: Reflect removal of regrtest package (closes: #122278). + Resulted in build failures on all architectures. + * Build -doc package from source. + + -- Matthias Klose Sat, 8 Dec 2001 00:38:41 +0100 + +python2.2 (2.1.99beta2-0.1) unstable; urgency=low + + * Non maintainer upload. + * New upstream version (this is 2.2beta2). + * Do not build the python-regrtest package anymore; keep the test framework + components test/regrtest.py and test/test_support.py in the python + package (closes: #119408). + + -- Gregor Hoffleit Tue, 27 Nov 2001 09:53:26 +0100 + +python2.2 (2.1.99beta1-4) unstable; urgency=low + + * Configure with --with-fpectl (closes: #118125). + * setup.py: Remove broken check for _curses_panel module (#116081). + * idle: Move config-* files to /etc and mark as conffiles (#106390). + * Move idle packages to section `devel'. + + -- Matthias Klose Wed, 31 Oct 2001 10:56:45 +0100 + +python2.2 (2.1.99beta1-3) unstable; urgency=low + + * Fix shlibs file (was still referring to 2.1). Closes: #116810. + * README.Debian: point to draft of python-policy in the python package. + + -- Matthias Klose Wed, 31 Oct 2001 10:56:45 +0100 + +python2.2 (2.1.99beta1-2) unstable; urgency=medium + + * Fix shlibs file (was still referring to 2.1). Closes: #116810. + * Rename package python2.2-base to python2.2. + + -- Matthias Klose Wed, 24 Oct 2001 23:00:50 +0200 + +python2.2 (2.1.99beta1-1) unstable; urgency=low + + * New upstream version (beta). Call the package version 2.1.99beta1-1. + * New maintainer until the final 2.2 release. + * Updated the debian patches. + + -- Matthias Klose Sat, 20 Oct 2001 18:56:26 +0200 + +python2.1 (2.1.1-1.2) unstable; urgency=low + + * Really remove the python alternative. + + -- Matthias Klose Sat, 20 Oct 2001 15:16:56 +0200 + +python2.1 (2.1.1-1.1) unstable; urgency=low + + * README FOR PACKAGE MAINTAINERS: It is planned to remove the python2-XXX + packages from unstable and move on to python2.1. + If you repackage/adapt your modules for python2.1, don't build + python2-XXX and python2.1-XXX packages from the same source package, + so that the python2-XXX package can be removed without influencing the + python2.1-XXX package. + + See the debian-python mailing list at http://lists.debian.org/devel.html + for details and the current discussion and a draft for a debian-python + policy (August to October 2001). + + * Remove alternative for /usr/bin/python. The python-base package now + provides the default python version. + + * Regenerate control file to fix build dependencies (closes: #116190). + * Remove alternative for /usr/bin/{python,pydoc}. + * Provide a libpython2.1.so symlink in /usr/lib/python2.1/config, + so that the shared library is found when -L/usr/lib/python2.1/config + is specified. + * Conflict with old package versions, where /usr/bin/python is a real + program (closes: #115943). + * python2.1-elisp conflicts with python-elisp (closes: #115895). + * We now have 2.1 (closes: #96851, #107849, #110243). + + -- Matthias Klose Fri, 19 Oct 2001 17:34:41 +0200 + +python2.1 (2.1.1-1) unstable; urgency=low + + * Incorporated Matthias' modifications. + + -- Gregor Hoffleit Thu, 11 Oct 2001 00:16:42 +0200 + +python2.1 (2.1.1-0.2) unstable; urgency=low + + * New upstream 2.1.1. + * GPL compatible licence (fixes #84080, #102949, #110643). + * Fixed upstream (closes: #99692, #111340). + * Build in separate build directory. 
+ * Split Debian patches into debian/patches directory. + * Build dependencies: Add libgmp3-dev, libexpat1-dev, tighten + debhelper dependency. + * debian/rules: Updated a "bit". + * python-elisp: Remove custom dependency (closes: #87783), + fix emacs path (closes: #89712), remove emacs19 dependency (#82694). + * Mention distutils in python-dev package description (closes: #108170). + * Update README.Debian (closes: #85430). + * Run versioned python in postinsts (closes: #113349). + * debian/sample.{postinst,prerm}: Change to version independent scripts. + * Use '/usr/bin/env python2.1' as interpreter for all python scripts. + * Add libssl-dev to Build-Conflicts. + * python-elisp: Add support for emacs21 (closes: #98635). + * Do not compress .py files in doc directories. + * Don't link explicitely with libc. + + -- Matthias Klose Wed, 3 Oct 2001 09:53:08 +0200 + +python2.1 (2.1.1-0.1) unstable; urgency=low + + * New upstream version (CVS branch release21-maint, will become 2.1.1): + This CVS branch will be released as 2.1.1 under a GPL compatible + license. + + -- Gregor Hoffleit Wed, 27 Jun 2001 22:47:58 +0200 + +python2 (2.1-0.1) unstable; urgency=low + + * Fixed Makefile.pre.in. + * Fixed the postinst files in order to use 2.1 (instead of 2.0). + * Mention the immanent release of 2.0.1 and 2.1.1, with a GPL + compatible license. + + -- Gregor Hoffleit Sun, 17 Jun 2001 21:05:25 +0200 + +python2 (2.1-0) unstable; urgency=low + + * New upstream version. + * Experimental packages. + + -- Gregor Hoffleit Thu, 10 May 2001 00:20:04 +0200 + +python2 (2.0-7) unstable; urgency=low + + * Rebuilt with recent tcl8.3-dev/tk8.3-dev in order to fix a + dependency problem with python2-tk (closes: #87793, #92962). + * Change postinst to create and update /usr/local/lib/python2.0 and + site-python with permissions and owner as mandated by policy: + 2775 and root:staff (closes: #89047). + * Fix to compileall.py: A superfluous argument made compileall without + options fail (cf. #92990 for python). + * Move the distutils module into python2-dev. It needs Makefile.pre.in + in order to work (closes: #89900). + * Remove build-dependency on libgdbm2-dev (which isn't built anyway). + * Add a build-dependency on libdb2-dev (cf. #90220 for python). + + -- Gregor Hoffleit Sat, 14 Apr 2001 21:07:51 +0200 + +python2 (2.0-6) unstable; urgency=low + + * Remove python-zlib package; merge it into python-base. + * Mark that README.python2 is not yet updated. + + -- Gregor Hoffleit Wed, 21 Feb 2001 12:34:18 +0100 + +python2 (2.0-5) unstable; urgency=low + + * Recompile with tcl/tk8.3 (closes: #82088). + * Modifications to README.why-python2 (closes: #82116). + * Add menu hint to idle2 menu entry. + * idle2 is renamed idle-python2 and now build correctly (closes: #82218). + * Add build-dependency on autoconf (closes: #85339). + * Build bsddbmodule as shared module (Modules/Setup.config.in), + and link libpython2.so with -lm in Makefile (closes: #86027). + * various cleanups in debian/rules, e.g. removing dh_suidregister. + * Make pdb available as /usr/bin/pdb-python2 in python2-dev + (cf. #79870 in python-base). + * Remove libgmp3 from build-dependencies, since we currently can't + build the mpzmodule for Python2 due to license problems. + + -- Gregor Hoffleit Sun, 18 Feb 2001 00:12:17 +0100 + +python2 (2.0-4) unstable; urgency=low + + * control: make python2-elisp conflict with python-elisp (it doesn't + make sense to have both of them installed, does it ?) + * include build-depend on libxmltok1-dev. 
+ * again, build with tcl/tk8.0. + + -- Gregor Hoffleit Wed, 10 Jan 2001 23:37:01 +0100 + +python2 (2.0-3) unstable; urgency=low + + * Modules/Setup.in: Added a missing \ that made _tkinter be built + incorrectly. + * rules: on the fly, change all '#!' python scripts to use python2. + + -- Gregor Hoffleit Wed, 13 Dec 2000 20:07:24 +0100 + +python2 (2.0-2) unstable; urgency=low + + * Aaargh. Remove conflicts/provides/replaces on python-base to make + parallel installation of python-base and python2-base possible. + * Install examples into /usr/share/doc/python2 (not python) and fix + symlink to python2.0 (thanks to Rick Younie for + pointing out this). + * Rename man page to python2.1. + + -- Gregor Hoffleit Wed, 13 Dec 2000 09:31:05 +0100 + +python2 (2.0-1) unstable; urgency=low + + * New upstream version. Initial release for python2. + + -- Gregor Hoffleit Mon, 11 Dec 2000 22:39:46 +0100 --- python3.4-3.4.1.orig/debian/changelog.shared +++ python3.4-3.4.1/debian/changelog.shared @@ -0,0 +1,3 @@ + * Link the interpreter against the shared runtime library. With + gcc-4.1 the difference in the pystones benchmark dropped from about + 12% to about 5%. --- python3.4-3.4.1.orig/debian/compat +++ python3.4-3.4.1/debian/compat @@ -0,0 +1 @@ +5 --- python3.4-3.4.1.orig/debian/control +++ python3.4-3.4.1/debian/control @@ -0,0 +1,214 @@ +Source: python3.4 +Section: python +Priority: optional +Maintainer: Matthias Klose +Build-Depends: debhelper (>= 5.0.51~), quilt, autoconf, lsb-release, sharutils, + libreadline6-dev, libncursesw5-dev (>= 5.3), gcc (>= 4:4.8.2-4), + zlib1g-dev, libbz2-dev, liblzma-dev, + libgdbm-dev, libdb-dev, + tk-dev, blt-dev (>= 2.4z), libssl-dev, + libexpat1-dev, libmpdec-dev (>= 2.4), + libbluetooth-dev [!hurd-i386 !kfreebsd-i386 !kfreebsd-amd64], + locales [!armel !avr32 !hppa !ia64 !mipsel], + libsqlite3-dev, libffi-dev (>= 3.0.5) [!or1k !avr32], + libgpm2 [!hurd-i386 !kfreebsd-i386 !kfreebsd-amd64], + mime-support, netbase, bzip2, python3:any, + net-tools, xvfb, xauth +Build-Depends-Indep: python-sphinx +Standards-Version: 3.9.5 +Vcs-Browser: https://code.launchpad.net/~doko/python/pkg3.4-debian +Vcs-Bzr: http://bazaar.launchpad.net/~doko/python/pkg3.4-debian +XS-Testsuite: autopkgtest + +Package: python3.4 +Architecture: any +Multi-Arch: allowed +Priority: optional +Depends: python3.4-minimal (= ${binary:Version}), libpython3.4-stdlib (= ${binary:Version}), mime-support, ${shlibs:Depends}, ${misc:Depends} +Suggests: python3.4-venv, python3.4-doc, binutils +Description: Interactive high-level object-oriented language (version 3.4) + Python is a high-level, interactive, object-oriented language. Its 3.4 version + includes an extensive class library with lots of goodies for + network programming, system administration, sounds and graphics. + +Package: python3.4-venv +Architecture: any +Multi-Arch: allowed +Priority: optional +Depends: python3.4 (= ${binary:Version}), + python-setuptools-whl, python-pip-whl, ${shlibs:Depends}, ${misc:Depends} +Replaces: python3.4 (<< 3.4.1) +Description: Interactive high-level object-oriented language (pyvenv binary, version 3.4) + Python is a high-level, interactive, object-oriented language. Its 3.4 version + includes an extensive class library with lots of goodies for + network programming, system administration, sounds and graphics. + . + This package contains the pyvenv-3.4 binary. 
+
+Package: libpython3.4-stdlib
+Architecture: any
+Multi-Arch: same
+Priority: optional
+Pre-Depends: multiarch-support
+Depends: libpython3.4-minimal (= ${binary:Version}), mime-support, ${shlibs:Depends}, ${misc:Depends}
+Description: Interactive high-level object-oriented language (standard library, version 3.4)
+ Python is a high-level, interactive, object-oriented language. Its 3.4 version
+ includes an extensive class library with lots of goodies for
+ network programming, system administration, sounds and graphics.
+ .
+ This package contains Python 3.4's standard library. It is normally not
+ used on its own, but as a dependency of python3.4.
+
+Package: python3.4-minimal
+Architecture: any
+Multi-Arch: allowed
+Priority: optional
+Pre-Depends: ${shlibs:Pre-Depends}
+Depends: libpython3.4-minimal (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends}
+Recommends: python3.4
+Suggests: binfmt-support
+Conflicts: binfmt-support (<< 1.1.2)
+Description: Minimal subset of the Python language (version 3.4)
+ This package contains the interpreter and some essential modules. It can
+ be used in the boot process for some basic tasks.
+ See /usr/share/doc/python3.4-minimal/README.Debian for a list of the modules
+ contained in this package.
+
+Package: libpython3.4-minimal
+Architecture: any
+Multi-Arch: same
+Priority: optional
+Pre-Depends: multiarch-support
+Depends: ${shlibs:Depends}, ${misc:Depends}
+Recommends: libpython3.4-stdlib
+Conflicts: binfmt-support (<< 1.1.2)
+Replaces: libpython3.4-stdlib (<< 3.4.0+20140425-1)
+Description: Minimal subset of the Python language (version 3.4)
+ This package contains some essential modules. It is normally not
+ used on its own, but as a dependency of python3.4-minimal.
+
+Package: libpython3.4
+Architecture: any
+Multi-Arch: same
+Section: libs
+Priority: optional
+Pre-Depends: multiarch-support
+Depends: libpython3.4-stdlib (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends}
+Description: Shared Python runtime library (version 3.4)
+ Python is a high-level, interactive, object-oriented language. Its 3.4 version
+ includes an extensive class library with lots of goodies for
+ network programming, system administration, sounds and graphics.
+ .
+ This package contains the shared runtime library, normally not needed
+ for programs using the statically linked interpreter.
+
+Package: python3.4-examples
+Architecture: all
+Depends: python3.4 (>= ${source:Version}), ${misc:Depends}
+Replaces: libpython3.4-testsuite (<< 3.4.1-8~)
+Description: Examples for the Python language (v3.4)
+ Examples, Demos and Tools for Python (v3.4). These are files included in
+ the upstream Python distribution (v3.4).
+
+Package: python3.4-dev
+Architecture: any
+Multi-Arch: allowed
+Depends: python3.4 (= ${binary:Version}), libpython3.4-dev (= ${binary:Version}), libpython3.4 (= ${binary:Version}), libexpat1-dev, ${shlibs:Depends}, ${misc:Depends}
+Recommends: libc6-dev | libc-dev
+Description: Header files and a static library for Python (v3.4)
+ Header files, a static library and development tools for building
+ Python (v3.4) modules, extending the Python interpreter or embedding
+ Python (v3.4) in applications.
+ .
+ Maintainers of Python packages should read README.maintainers.
+
+Package: libpython3.4-dev
+Section: libdevel
+Architecture: any
+Multi-Arch: same
+Pre-Depends: multiarch-support
+Depends: libpython3.4-stdlib (= ${binary:Version}), libpython3.4 (= ${binary:Version}), libexpat1-dev, ${shlibs:Depends}, ${misc:Depends}
+Recommends: libc6-dev | libc-dev
+Description: Header files and a static library for Python (v3.4)
+ Header files, a static library and development tools for building
+ Python (v3.4) modules, extending the Python interpreter or embedding
+ Python (v3.4) in applications.
+ .
+ Maintainers of Python packages should read README.maintainers.
+ .
+ This package contains development files. It is normally not
+ used on its own, but as a dependency of python3.4-dev.
+
+Package: libpython3.4-testsuite
+Section: libdevel
+Architecture: all
+Depends: python3.4 (>= ${binary:Version}), ${misc:Depends}, net-tools
+Suggests: python3-gdbm, python3-tk
+Description: Testsuite for the Python standard library (v3.4)
+ The complete testsuite for the Python standard library. Note that
+ a subset is found in the libpython3.4-stdlib package, which should
+ be enough for other packages to use (please do not build-depend
+ on this package, but file a bug report to include additional
+ testsuite files in the libpython3.4-stdlib package).
+
+Package: idle-python3.4
+Architecture: all
+Depends: python3.4, python3-tk, python3.4-tk, ${misc:Depends}
+Enhances: python3.4
+Description: IDE for Python (v3.4) using Tkinter
+ IDLE is an Integrated Development Environment for Python (v3.4).
+ IDLE is written using Tkinter and therefore quite platform-independent.
+
+Package: python3.4-doc
+Section: doc
+Architecture: all
+Depends: libjs-jquery, libjs-underscore, ${misc:Depends}
+Suggests: python3.4
+Description: Documentation for the high-level object-oriented language Python (v3.4)
+ This is the official set of documentation for the interactive high-level
+ object-oriented language Python (v3.4). All documents are provided
+ in HTML format. The package consists of ten documents:
+ .
+ * What's New in Python3.4
+ * Tutorial
+ * Python Library Reference
+ * Macintosh Module Reference
+ * Python Language Reference
+ * Extending and Embedding Python
+ * Python/C API Reference
+ * Installing Python Modules
+ * Documenting Python
+ * Distributing Python Modules
+
+Package: python3.4-dbg
+Section: debug
+Architecture: any
+Multi-Arch: allowed
+Priority: extra
+Depends: python3.4 (= ${binary:Version}), libpython3.4-dbg (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends}
+Recommends: gdb
+Suggests: python3-gdbm-dbg, python3-tk-dbg
+Description: Debug Build of the Python Interpreter (version 3.4)
+ The package holds two things:
+ .
+ - A Python interpreter configured with --pydebug. Dynamically loaded modules
+ are searched as _d.so first. Third party extensions need a separate
+ build to be used by this interpreter.
+ - Debug information for standard python interpreter and extensions.
+ .
+ See the README.debug for more information.
+
+Package: libpython3.4-dbg
+Section: debug
+Architecture: any
+Multi-Arch: same
+Priority: extra
+Pre-Depends: multiarch-support
+Depends: libpython3.4-stdlib (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends}
+Description: Debug Build of the Python Interpreter (version 3.4)
+ The package holds two things:
+ .
+ - Extensions for a Python interpreter configured with --pydebug.
+ - Debug information for standard python extensions.
+ .
+ See the README.debug for more information.
--- python3.4-3.4.1.orig/debian/control.in +++ python3.4-3.4.1/debian/control.in @@ -0,0 +1,214 @@ +Source: @PVER@ +Section: python +Priority: optional +Maintainer: Matthias Klose +Build-Depends: debhelper (>= 5.0.51~), quilt, autoconf, lsb-release, sharutils, + libreadline6-dev, libncursesw5-dev (>= 5.3), @bd_gcc@ + zlib1g-dev, libbz2-dev, liblzma-dev, + libgdbm-dev, libdb-dev, + tk-dev, blt-dev (>= 2.4z), libssl-dev, + libexpat1-dev, libmpdec-dev (>= 2.4), + libbluetooth-dev [!hurd-i386 !kfreebsd-i386 !kfreebsd-amd64], + locales [!armel !avr32 !hppa !ia64 !mipsel], + libsqlite3-dev, libffi-dev (>= 3.0.5) [!or1k !avr32], + libgpm2 [!hurd-i386 !kfreebsd-i386 !kfreebsd-amd64], + mime-support, netbase, bzip2, python3@bd_qual@, + net-tools, xvfb, xauth +Build-Depends-Indep: python-sphinx +Standards-Version: 3.9.5 +Vcs-Browser: https://code.launchpad.net/~doko/python/pkg@VER@-debian +Vcs-Bzr: http://bazaar.launchpad.net/~doko/python/pkg@VER@-debian +XS-Testsuite: autopkgtest + +Package: @PVER@ +Architecture: any +Multi-Arch: allowed +Priority: @PRIO@ +Depends: @PVER@-minimal (= ${binary:Version}), lib@PVER@-stdlib (= ${binary:Version}), mime-support, ${shlibs:Depends}, ${misc:Depends} +Suggests: @PVER@-venv, @PVER@-doc, binutils +Description: Interactive high-level object-oriented language (version @VER@) + Python is a high-level, interactive, object-oriented language. Its @VER@ version + includes an extensive class library with lots of goodies for + network programming, system administration, sounds and graphics. + +Package: @PVER@-venv +Architecture: any +Multi-Arch: allowed +Priority: @PRIO@ +Depends: @PVER@ (= ${binary:Version}), + python-setuptools-whl, python-pip-whl, ${shlibs:Depends}, ${misc:Depends} +Replaces: python3.4 (<< 3.4.1) +Description: Interactive high-level object-oriented language (pyvenv binary, version @VER@) + Python is a high-level, interactive, object-oriented language. Its @VER@ version + includes an extensive class library with lots of goodies for + network programming, system administration, sounds and graphics. + . + This package contains the pyvenv-@VER@ binary. + +Package: lib@PVER@-stdlib +Architecture: any +Multi-Arch: same +Priority: @PRIO@ +Pre-Depends: multiarch-support +Depends: lib@PVER@-minimal (= ${binary:Version}), mime-support, ${shlibs:Depends}, ${misc:Depends} +Description: Interactive high-level object-oriented language (standard library, version @VER@) + Python is a high-level, interactive, object-oriented language. Its @VER@ version + includes an extensive class library with lots of goodies for + network programming, system administration, sounds and graphics. + . + This package contains Python @VER@'s standard library. It is normally not + used on its own, but as a dependency of python@VER@. + +Package: @PVER@-minimal +Architecture: any +Multi-Arch: allowed +Priority: @MINPRIO@ +Pre-Depends: ${shlibs:Pre-Depends} +Depends: lib@PVER@-minimal (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends} +Recommends: @PVER@ +Suggests: binfmt-support +Conflicts: binfmt-support (<< 1.1.2) +Description: Minimal subset of the Python language (version @VER@) + This package contains the interpreter and some essential modules. It can + be used in the boot process for some basic tasks. + See /usr/share/doc/@PVER@-minimal/README.Debian for a list of the modules + contained in this package. 
+
+Package: lib@PVER@-minimal
+Architecture: any
+Multi-Arch: same
+Priority: @MINPRIO@
+Pre-Depends: multiarch-support
+Depends: ${shlibs:Depends}, ${misc:Depends}
+Recommends: lib@PVER@-stdlib
+Conflicts: binfmt-support (<< 1.1.2)
+Replaces: libpython3.4-stdlib (<< 3.4.0+20140425-1)
+Description: Minimal subset of the Python language (version @VER@)
+ This package contains some essential modules. It is normally not
+ used on its own, but as a dependency of @PVER@-minimal.
+
+Package: lib@PVER@
+Architecture: any
+Multi-Arch: same
+Section: libs
+Priority: @PRIO@
+Pre-Depends: multiarch-support
+Depends: lib@PVER@-stdlib (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends}
+Description: Shared Python runtime library (version @VER@)
+ Python is a high-level, interactive, object-oriented language. Its @VER@ version
+ includes an extensive class library with lots of goodies for
+ network programming, system administration, sounds and graphics.
+ .
+ This package contains the shared runtime library, normally not needed
+ for programs using the statically linked interpreter.
+
+Package: @PVER@-examples
+Architecture: all
+Depends: @PVER@ (>= ${source:Version}), ${misc:Depends}
+Replaces: lib@PVER@-testsuite (<< 3.4.1-8~)
+Description: Examples for the Python language (v@VER@)
+ Examples, Demos and Tools for Python (v@VER@). These are files included in
+ the upstream Python distribution (v@VER@).
+
+Package: @PVER@-dev
+Architecture: any
+Multi-Arch: allowed
+Depends: @PVER@ (= ${binary:Version}), lib@PVER@-dev (= ${binary:Version}), lib@PVER@ (= ${binary:Version}), libexpat1-dev, ${shlibs:Depends}, ${misc:Depends}
+Recommends: libc6-dev | libc-dev
+Description: Header files and a static library for Python (v@VER@)
+ Header files, a static library and development tools for building
+ Python (v@VER@) modules, extending the Python interpreter or embedding
+ Python (v@VER@) in applications.
+ .
+ Maintainers of Python packages should read README.maintainers.
+
+Package: lib@PVER@-dev
+Section: libdevel
+Architecture: any
+Multi-Arch: same
+Pre-Depends: multiarch-support
+Depends: lib@PVER@-stdlib (= ${binary:Version}), lib@PVER@ (= ${binary:Version}), libexpat1-dev, ${shlibs:Depends}, ${misc:Depends}
+Recommends: libc6-dev | libc-dev
+Description: Header files and a static library for Python (v@VER@)
+ Header files, a static library and development tools for building
+ Python (v@VER@) modules, extending the Python interpreter or embedding
+ Python (v@VER@) in applications.
+ .
+ Maintainers of Python packages should read README.maintainers.
+ .
+ This package contains development files. It is normally not
+ used on its own, but as a dependency of @PVER@-dev.
+
+Package: lib@PVER@-testsuite
+Section: libdevel
+Architecture: all
+Depends: @PVER@ (>= ${binary:Version}), ${misc:Depends}, net-tools
+Suggests: python3-gdbm, python3-tk
+Description: Testsuite for the Python standard library (v@VER@)
+ The complete testsuite for the Python standard library. Note that
+ a subset is found in the lib@PVER@-stdlib package, which should
+ be enough for other packages to use (please do not build-depend
+ on this package, but file a bug report to include additional
+ testsuite files in the lib@PVER@-stdlib package).
+
+Package: idle-@PVER@
+Architecture: all
+Depends: @PVER@, python3-tk, @PVER@-tk, ${misc:Depends}
+Enhances: @PVER@
+Description: IDE for Python (v@VER@) using Tkinter
+ IDLE is an Integrated Development Environment for Python (v@VER@).
+ IDLE is written using Tkinter and therefore quite platform-independent. + +Package: @PVER@-doc +Section: doc +Architecture: all +Depends: libjs-jquery, libjs-underscore, ${misc:Depends} +Suggests: @PVER@ +Description: Documentation for the high-level object-oriented language Python (v@VER@) + This is the official set of documentation for the interactive high-level + object-oriented language Python (v@VER@). All documents are provided + in HTML format. The package consists of ten documents: + . + * What's New in Python@VER@ + * Tutorial + * Python Library Reference + * Macintosh Module Reference + * Python Language Reference + * Extending and Embedding Python + * Python/C API Reference + * Installing Python Modules + * Documenting Python + * Distributing Python Modules + +Package: @PVER@-dbg +Section: debug +Architecture: any +Multi-Arch: allowed +Priority: extra +Depends: @PVER@ (= ${binary:Version}), lib@PVER@-dbg (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends} +Recommends: gdb +Suggests: python3-gdbm-dbg, python3-tk-dbg +Description: Debug Build of the Python Interpreter (version @VER@) + The package holds two things: + . + - A Python interpreter configured with --pydebug. Dynamically loaded modules + are searched as _d.so first. Third party extensions need a separate + build to be used by this interpreter. + - Debug information for standard python interpreter and extensions. + . + See the README.debug for more information. + +Package: lib@PVER@-dbg +Section: debug +Architecture: any +Multi-Arch: same +Priority: extra +Pre-Depends: multiarch-support +Depends: lib@PVER@-stdlib (= ${binary:Version}), ${shlibs:Depends}, ${misc:Depends} +Description: Debug Build of the Python Interpreter (version @VER@) + The package holds two things: + . + - Extensions for a Python interpreter configured with --pydebug. + - Debug information for standard python extensions. + . + See the README.debug for more information. --- python3.4-3.4.1.orig/debian/control.stdlib +++ python3.4-3.4.1/debian/control.stdlib @@ -0,0 +1,16 @@ +Package: @PVER@-tk +Architecture: any +Depends: @PVER@ (= ${Source-Version}), ${shlibs:Depends} +Suggests: tix +XB-Python-Version: @VER@ +Description: Tkinter - Writing Tk applications with Python (v@VER@) + A module for writing portable GUI applications with Python (v@VER@) using Tk. + Also known as Tkinter. + +Package: @PVER@-gdbm +Architecture: any +Depends: @PVER@ (= ${Source-Version}), ${shlibs:Depends} +Description: GNU dbm database support for Python (v@VER@) + GNU dbm database module for Python. Install this if you want to + create or read GNU dbm database files with Python. + --- python3.4-3.4.1.orig/debian/control.udeb +++ python3.4-3.4.1/debian/control.udeb @@ -0,0 +1,11 @@ + +Package: @PVER@-udeb +XC-Package-Type: udeb +Section: debian-installer +Architecture: any +Depends: ${shlibs:Depends}, ${misc:Depends} +XB-Python-Runtime: @PVER@ +XB-Python-Version: @VER@ +Description: A minimal subset of the Python language (version @VER@) + This package contains the interpreter and some essential modules, packaged + for use in the installer. --- python3.4-3.4.1.orig/debian/copyright +++ python3.4-3.4.1/debian/copyright @@ -0,0 +1,1028 @@ +This package was put together by Klee Dienes from +sources from ftp.python.org:/pub/python, based on the Debianization by +the previous maintainers Bernd S. Brentrup and +Bruce Perens. Current maintainer is Matthias Klose . + +It was downloaded from http://python.org/ + +Copyright: + +Upstream Author: Guido van Rossum and others. 
+ +License: + +The following text includes the Python license and licenses and +acknowledgements for incorporated software. The licenses can be read +in the HTML and texinfo versions of the documentation as well, after +installing the pythonx.y-doc package. Licenses for files not licensed +under the Python Licenses are found at the end of this file. + + +Python License +============== + +A. HISTORY OF THE SOFTWARE +========================== + +Python was created in the early 1990s by Guido van Rossum at Stichting +Mathematisch Centrum (CWI, see http://www.cwi.nl) in the Netherlands +as a successor of a language called ABC. Guido remains Python's +principal author, although it includes many contributions from others. + +In 1995, Guido continued his work on Python at the Corporation for +National Research Initiatives (CNRI, see http://www.cnri.reston.va.us) +in Reston, Virginia where he released several versions of the +software. + +In May 2000, Guido and the Python core development team moved to +BeOpen.com to form the BeOpen PythonLabs team. In October of the same +year, the PythonLabs team moved to Digital Creations (now Zope +Corporation, see http://www.zope.com). In 2001, the Python Software +Foundation (PSF, see http://www.python.org/psf/) was formed, a +non-profit organization created specifically to own Python-related +Intellectual Property. Zope Corporation is a sponsoring member of +the PSF. + +All Python releases are Open Source (see http://www.opensource.org for +the Open Source Definition). Historically, most, but not all, Python +releases have also been GPL-compatible; the table below summarizes +the various releases. + + Release Derived Year Owner GPL- + from compatible? (1) + + 0.9.0 thru 1.2 1991-1995 CWI yes + 1.3 thru 1.5.2 1.2 1995-1999 CNRI yes + 1.6 1.5.2 2000 CNRI no + 2.0 1.6 2000 BeOpen.com no + 1.6.1 1.6 2001 CNRI yes (2) + 2.1 2.0+1.6.1 2001 PSF no + 2.0.1 2.0+1.6.1 2001 PSF yes + 2.1.1 2.1+2.0.1 2001 PSF yes + 2.2 2.1.1 2001 PSF yes + 2.1.2 2.1.1 2002 PSF yes + 2.1.3 2.1.2 2002 PSF yes + 2.2.1 2.2 2002 PSF yes + 2.2.2 2.2.1 2002 PSF yes + 2.2.3 2.2.2 2003 PSF yes + 2.3 2.2.2 2002-2003 PSF yes + 2.3.1 2.3 2002-2003 PSF yes + 2.3.2 2.3.1 2002-2003 PSF yes + 2.3.3 2.3.2 2002-2003 PSF yes + 2.3.4 2.3.3 2004 PSF yes + 2.3.5 2.3.4 2005 PSF yes + 2.4 2.3 2004 PSF yes + 2.4.1 2.4 2005 PSF yes + 2.4.2 2.4.1 2005 PSF yes + 2.4.3 2.4.2 2006 PSF yes + 2.5 2.4 2006 PSF yes + 2.5.1 2.5 2007 PSF yes + 2.5.2 2.5.1 2008 PSF yes + 2.5.3 2.5.2 2008 PSF yes + 2.6 2.5 2008 PSF yes + 2.6.1 2.6 2008 PSF yes + 2.6.2 2.6.1 2009 PSF yes + 2.6.3 2.6.2 2009 PSF yes + 2.6.4 2.6.3 2009 PSF yes + 2.6.5 2.6.4 2010 PSF yes + 3.0 2.6 2008 PSF yes + 3.0.1 3.0 2009 PSF yes + 3.1 3.0.1 2009 PSF yes + 3.1.1 3.1 2009 PSF yes + 3.1.2 3.1.1 2010 PSF yes + 3.1.3 3.1.2 2010 PSF yes + 3.1.4 3.1.3 2011 PSF yes + 3.2 3.1 2011 PSF yes + 3.2.1 3.2 2011 PSF yes + 3.2.2 3.2.1 2011 PSF yes + 3.3 3.2 2012 PSF yes + +Footnotes: + +(1) GPL-compatible doesn't mean that we're distributing Python under + the GPL. All Python licenses, unlike the GPL, let you distribute + a modified version without making your changes open source. The + GPL-compatible licenses make it possible to combine Python with + other software that is released under the GPL; the others don't. + +(2) According to Richard Stallman, 1.6.1 is not GPL-compatible, + because its license has a choice of law clause. According to + CNRI, however, Stallman's lawyer has told CNRI's lawyer that 1.6.1 + is "not incompatible" with the GPL. 
+ +Thanks to the many outside volunteers who have worked under Guido's +direction to make these releases possible. + + +B. TERMS AND CONDITIONS FOR ACCESSING OR OTHERWISE USING PYTHON +=============================================================== + +PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2 +-------------------------------------------- + +1. This LICENSE AGREEMENT is between the Python Software Foundation +("PSF"), and the Individual or Organization ("Licensee") accessing and +otherwise using this software ("Python") in source or binary form and +its associated documentation. + +2. Subject to the terms and conditions of this License Agreement, PSF +hereby grants Licensee a nonexclusive, royalty-free, world-wide +license to reproduce, analyze, test, perform and/or display publicly, +prepare derivative works, distribute, and otherwise use Python alone +or in any derivative version, provided, however, that PSF's License +Agreement and PSF's notice of copyright, i.e., "Copyright (c) 2001, +2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, +2013, 2014 Python Software Foundation; All Rights Reserved" are +retained in Python alone or in any derivative version prepared by +Licensee. + +3. In the event Licensee prepares a derivative work that is based on +or incorporates Python or any part thereof, and wants to make +the derivative work available to others as provided herein, then +Licensee hereby agrees to include in any such work a brief summary of +the changes made to Python. + +4. PSF is making Python available to Licensee on an "AS IS" +basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR +IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND +DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS +FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT +INFRINGE ANY THIRD PARTY RIGHTS. + +5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON +FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS +A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON, +OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF. + +6. This License Agreement will automatically terminate upon a material +breach of its terms and conditions. + +7. Nothing in this License Agreement shall be deemed to create any +relationship of agency, partnership, or joint venture between PSF and +Licensee. This License Agreement does not grant permission to use PSF +trademarks or trade name in a trademark sense to endorse or promote +products or services of Licensee, or any third party. + +8. By copying, installing or otherwise using Python, Licensee +agrees to be bound by the terms and conditions of this License +Agreement. + + +BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0 +------------------------------------------- + +BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1 + +1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an +office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the +Individual or Organization ("Licensee") accessing and otherwise using +this software in source or binary form and its associated +documentation ("the Software"). + +2. 
Subject to the terms and conditions of this BeOpen Python License +Agreement, BeOpen hereby grants Licensee a non-exclusive, +royalty-free, world-wide license to reproduce, analyze, test, perform +and/or display publicly, prepare derivative works, distribute, and +otherwise use the Software alone or in any derivative version, +provided, however, that the BeOpen Python License is retained in the +Software, alone or in any derivative version prepared by Licensee. + +3. BeOpen is making the Software available to Licensee on an "AS IS" +basis. BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR +IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND +DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS +FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT +INFRINGE ANY THIRD PARTY RIGHTS. + +4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE +SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS +AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY +DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF. + +5. This License Agreement will automatically terminate upon a material +breach of its terms and conditions. + +6. This License Agreement shall be governed by and interpreted in all +respects by the law of the State of California, excluding conflict of +law provisions. Nothing in this License Agreement shall be deemed to +create any relationship of agency, partnership, or joint venture +between BeOpen and Licensee. This License Agreement does not grant +permission to use BeOpen trademarks or trade names in a trademark +sense to endorse or promote products or services of Licensee, or any +third party. As an exception, the "BeOpen Python" logos available at +http://www.pythonlabs.com/logos.html may be used according to the +permissions granted on that web page. + +7. By copying, installing or otherwise using the software, Licensee +agrees to be bound by the terms and conditions of this License +Agreement. + + +CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1 +--------------------------------------- + +1. This LICENSE AGREEMENT is between the Corporation for National +Research Initiatives, having an office at 1895 Preston White Drive, +Reston, VA 20191 ("CNRI"), and the Individual or Organization +("Licensee") accessing and otherwise using Python 1.6.1 software in +source or binary form and its associated documentation. + +2. Subject to the terms and conditions of this License Agreement, CNRI +hereby grants Licensee a nonexclusive, royalty-free, world-wide +license to reproduce, analyze, test, perform and/or display publicly, +prepare derivative works, distribute, and otherwise use Python 1.6.1 +alone or in any derivative version, provided, however, that CNRI's +License Agreement and CNRI's notice of copyright, i.e., "Copyright (c) +1995-2001 Corporation for National Research Initiatives; All Rights +Reserved" are retained in Python 1.6.1 alone or in any derivative +version prepared by Licensee. Alternately, in lieu of CNRI's License +Agreement, Licensee may substitute the following text (omitting the +quotes): "Python 1.6.1 is made available subject to the terms and +conditions in CNRI's License Agreement. This Agreement together with +Python 1.6.1 may be located on the Internet using the following +unique, persistent identifier (known as a handle): 1895.22/1013. 
This +Agreement may also be obtained from a proxy server on the Internet +using the following URL: http://hdl.handle.net/1895.22/1013". + +3. In the event Licensee prepares a derivative work that is based on +or incorporates Python 1.6.1 or any part thereof, and wants to make +the derivative work available to others as provided herein, then +Licensee hereby agrees to include in any such work a brief summary of +the changes made to Python 1.6.1. + +4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS" +basis. CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR +IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND +DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS +FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT +INFRINGE ANY THIRD PARTY RIGHTS. + +5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON +1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS +A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1, +OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF. + +6. This License Agreement will automatically terminate upon a material +breach of its terms and conditions. + +7. This License Agreement shall be governed by the federal +intellectual property law of the United States, including without +limitation the federal copyright law, and, to the extent such +U.S. federal law does not apply, by the law of the Commonwealth of +Virginia, excluding Virginia's conflict of law provisions. +Notwithstanding the foregoing, with regard to derivative works based +on Python 1.6.1 that incorporate non-separable material that was +previously distributed under the GNU General Public License (GPL), the +law of the Commonwealth of Virginia shall govern this License +Agreement only as to issues arising under or with respect to +Paragraphs 4, 5, and 7 of this License Agreement. Nothing in this +License Agreement shall be deemed to create any relationship of +agency, partnership, or joint venture between CNRI and Licensee. This +License Agreement does not grant permission to use CNRI trademarks or +trade name in a trademark sense to endorse or promote products or +services of Licensee, or any third party. + +8. By clicking on the "ACCEPT" button where indicated, or by copying, +installing or otherwise using Python 1.6.1, Licensee agrees to be +bound by the terms and conditions of this License Agreement. + + ACCEPT + + +CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2 +-------------------------------------------------- + +Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam, +The Netherlands. All rights reserved. + +Permission to use, copy, modify, and distribute this software and its +documentation for any purpose and without fee is hereby granted, +provided that the above copyright notice appear in all copies and that +both that copyright notice and this permission notice appear in +supporting documentation, and that the name of Stichting Mathematisch +Centrum or CWI not be used in advertising or publicity pertaining to +distribution of the software without specific, written prior +permission. 
+ +STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO +THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND +FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE +FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT +OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. + + +Licenses and Acknowledgements for Incorporated Software +======================================================= + +Mersenne Twister +---------------- + +The `_random' module includes code based on a download from +`http://www.math.keio.ac.jp/~matumoto/MT2002/emt19937ar.html'. The +following are the verbatim comments from the original code: + + A C-program for MT19937, with initialization improved 2002/1/26. + Coded by Takuji Nishimura and Makoto Matsumoto. + + Before using, initialize the state by using init_genrand(seed) + or init_by_array(init_key, key_length). + + Copyright (C) 1997 - 2002, Makoto Matsumoto and Takuji Nishimura, + All rights reserved. + + Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + + 1. Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + + 2. Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in the + documentation and/or other materials provided with the distribution. + + 3. The names of its contributors may not be used to endorse or promote + products derived from this software without specific prior written + permission. + + THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR + A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT + OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, + SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED + TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR + PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF + LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING + NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS + SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + + Any feedback is very welcome. + http://www.math.keio.ac.jp/matumoto/emt.html + email: matumoto@math.keio.ac.jp + + +Sockets +------- + +The `socket' module uses the functions, `getaddrinfo', and +`getnameinfo', which are coded in separate source files from the WIDE +Project, `http://www.wide.ad.jp/about/index.html'. + + Copyright (C) 1995, 1996, 1997, and 1998 WIDE Project. + All rights reserved. + + Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + 1. Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + 2. Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in the + documentation and/or other materials provided with the distribution. + 3. 
Neither the name of the project nor the names of its contributors + may be used to endorse or promote products derived from this software + without specific prior written permission. + + THIS SOFTWARE IS PROVIDED BY THE PROJECT AND CONTRIBUTORS ``AS IS'' AND + GAI_ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE + IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE + ARE DISCLAIMED. IN NO EVENT SHALL THE PROJECT OR CONTRIBUTORS BE LIABLE + FOR GAI_ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR + CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF + SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS + INTERRUPTION) HOWEVER CAUSED AND ON GAI_ANY THEORY OF LIABILITY, WHETHER + IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) + ARISING IN GAI_ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED + OF THE POSSIBILITY OF SUCH DAMAGE. + + +Floating point exception control +-------------------------------- + +The source for the `fpectl' module includes the following notice: + + --------------------------------------------------------------------- + / Copyright (c) 1996. \ + | The Regents of the University of California. | + | All rights reserved. | + | | + | Permission to use, copy, modify, and distribute this software for | + | any purpose without fee is hereby granted, provided that this en- | + | tire notice is included in all copies of any software which is or | + | includes a copy or modification of this software and in all | + | copies of the supporting documentation for such software. | + | | + | This work was produced at the University of California, Lawrence | + | Livermore National Laboratory under contract no. W-7405-ENG-48 | + | between the U.S. Department of Energy and The Regents of the | + | University of California for the operation of UC LLNL. | + | | + | DISCLAIMER | + | | + | This software was prepared as an account of work sponsored by an | + | agency of the United States Government. Neither the United States | + | Government nor the University of California nor any of their em- | + | ployees, makes any warranty, express or implied, or assumes any | + | liability or responsibility for the accuracy, completeness, or | + | usefulness of any information, apparatus, product, or process | + | disclosed, or represents that its use would not infringe | + | privately-owned rights. Reference herein to any specific commer- | + | cial products, process, or service by trade name, trademark, | + | manufacturer, or otherwise, does not necessarily constitute or | + | imply its endorsement, recommendation, or favoring by the United | + | States Government or the University of California. The views and | + | opinions of authors expressed herein do not necessarily state or | + | reflect those of the United States Government or the University | + | of California, and shall not be used for advertising or product | + \ endorsement purposes. 
/ + --------------------------------------------------------------------- + + +Cookie management +----------------- + +The `Cookie' module contains the following notice: + + Copyright 2000 by Timothy O'Malley + + All Rights Reserved + + Permission to use, copy, modify, and distribute this software + and its documentation for any purpose and without fee is hereby + granted, provided that the above copyright notice appear in all + copies and that both that copyright notice and this permission + notice appear in supporting documentation, and that the name of + Timothy O'Malley not be used in advertising or publicity + pertaining to distribution of the software without specific, written + prior permission. + + Timothy O'Malley DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS + SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY + AND FITNESS, IN NO EVENT SHALL Timothy O'Malley BE LIABLE FOR + ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES + WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, + WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS + ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR + PERFORMANCE OF THIS SOFTWARE. + + +Execution tracing +----------------- + +The `trace' module contains the following notice: + + portions copyright 2001, Autonomous Zones Industries, Inc., all rights... + err... reserved and offered to the public under the terms of the + Python 2.2 license. + Author: Zooko O'Whielacronx + http://zooko.com/ + mailto:zooko@zooko.com + + Copyright 2000, Mojam Media, Inc., all rights reserved. + Author: Skip Montanaro + + Copyright 1999, Bioreason, Inc., all rights reserved. + Author: Andrew Dalke + + Copyright 1995-1997, Automatrix, Inc., all rights reserved. + Author: Skip Montanaro + + Copyright 1991-1995, Stichting Mathematisch Centrum, all rights reserved. + + Permission to use, copy, modify, and distribute this Python software and + its associated documentation for any purpose without fee is hereby + granted, provided that the above copyright notice appears in all copies, + and that both that copyright notice and this permission notice appear in + supporting documentation, and that the name of neither Automatrix, + Bioreason or Mojam Media be used in advertising or publicity pertaining + to distribution of the software without specific, written prior + permission. + + +UUencode and UUdecode functions +------------------------------- + +The `uu' module contains the following notice: + + Copyright 1994 by Lance Ellinghouse + Cathedral City, California Republic, United States of America. + All Rights Reserved + Permission to use, copy, modify, and distribute this software and its + documentation for any purpose and without fee is hereby granted, + provided that the above copyright notice appear in all copies and that + both that copyright notice and this permission notice appear in + supporting documentation, and that the name of Lance Ellinghouse + not be used in advertising or publicity pertaining to distribution + of the software without specific, written prior permission. 
+ LANCE ELLINGHOUSE DISCLAIMS ALL WARRANTIES WITH REGARD TO + THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND + FITNESS, IN NO EVENT SHALL LANCE ELLINGHOUSE CENTRUM BE LIABLE + FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES + WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN + ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT + OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. + + Modified by Jack Jansen, CWI, July 1995: + - Use binascii module to do the actual line-by-line conversion + between ascii and binary. This results in a 1000-fold speedup. The C + version is still 5 times faster, though. + - Arguments more compliant with python standard + + +XML Remote Procedure Calls +-------------------------- + +The `xmlrpclib' module contains the following notice: + + The XML-RPC client interface is + + Copyright (c) 1999-2002 by Secret Labs AB + Copyright (c) 1999-2002 by Fredrik Lundh + + By obtaining, using, and/or copying this software and/or its + associated documentation, you agree that you have read, understood, + and will comply with the following terms and conditions: + + Permission to use, copy, modify, and distribute this software and + its associated documentation for any purpose and without fee is + hereby granted, provided that the above copyright notice appears in + all copies, and that both that copyright notice and this permission + notice appear in supporting documentation, and that the name of + Secret Labs AB or the author not be used in advertising or publicity + pertaining to distribution of the software without specific, written + prior permission. + + SECRET LABS AB AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD + TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANT- + ABILITY AND FITNESS. IN NO EVENT SHALL SECRET LABS AB OR THE AUTHOR + BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY + DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, + WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS + ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE + OF THIS SOFTWARE. + +Licenses for Software linked to +=============================== + +Note that the choice of GPL compatibility outlined above doesn't extend +to modules linked to particular libraries, since they change the +effective License of the module binary. + + +GNU Readline +------------ + +The 'readline' module makes use of GNU Readline. + + The GNU Readline Library is free software; you can redistribute it + and/or modify it under the terms of the GNU General Public License as + published by the Free Software Foundation; either version 2, or (at + your option) any later version. + + On Debian systems, you can find the complete statement in + /usr/share/doc/readline-common/copyright'. A copy of the GNU General + Public License is available in /usr/share/common-licenses/GPL-2'. + + +OpenSSL +------- + +The '_ssl' module makes use of OpenSSL. + + The OpenSSL toolkit stays under a dual license, i.e. both the + conditions of the OpenSSL License and the original SSLeay license + apply to the toolkit. Actually both licenses are BSD-style Open + Source licenses. Note that both licenses are incompatible with + the GPL. + + On Debian systems, you can find the complete license text in + /usr/share/doc/openssl/copyright'. 
+ + +Files with other licenses than the Python License +------------------------------------------------- + +Files: Include/dynamic_annotations.h +Files: Python/dynamic_annotations.c +Copyright: (c) 2008-2009, Google Inc. +License: Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions are + met: + + * Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + * Neither the name of Google Inc. nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + + THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS + "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT + LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR + A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT + OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, + SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT + LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, + DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY + THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT + (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE + OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + +Files: Include/unicodeobject.h +Copyright: (c) Corporation for National Research Initiatives. +Copyright: (c) 1999 by Secret Labs AB. +Copyright: (c) 1999 by Fredrik Lundh. +License: By obtaining, using, and/or copying this software and/or its + associated documentation, you agree that you have read, understood, + and will comply with the following terms and conditions: + + Permission to use, copy, modify, and distribute this software and its + associated documentation for any purpose and without fee is hereby + granted, provided that the above copyright notice appears in all + copies, and that both that copyright notice and this permission notice + appear in supporting documentation, and that the name of Secret Labs + AB or the author not be used in advertising or publicity pertaining to + distribution of the software without specific, written prior + permission. + + SECRET LABS AB AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO + THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND + FITNESS. IN NO EVENT SHALL SECRET LABS AB OR THE AUTHOR BE LIABLE FOR + ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES + WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN + ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT + OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. + +Files: Lib/logging/* +Copyright: 2001-2010 by Vinay Sajip. All Rights Reserved. +License: Permission to use, copy, modify, and distribute this software and + its documentation for any purpose and without fee is hereby granted, + provided that the above copyright notice appear in all copies and that + both that copyright notice and this permission notice appear in + supporting documentation, and that the name of Vinay Sajip + not be used in advertising or publicity pertaining to distribution + of the software without specific, written prior permission. + VINAY SAJIP DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING + ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. 
IN NO EVENT SHALL + VINAY SAJIP BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR + ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER + IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT + OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. + +Files: Lib/multiprocessing/* +Files: Modules/_multiprocessing/* +Copyright: (c) 2006-2008, R Oudkerk. All rights reserved. +License: Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + + 1. Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + 2. Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in the + documentation and/or other materials provided with the distribution. + 3. Neither the name of author nor the names of any contributors may be + used to endorse or promote products derived from this software + without specific prior written permission. + + THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS "AS IS" AND + ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE + IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE + ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE + FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL + DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS + OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) + HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT + LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY + OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF + SUCH DAMAGE. + +Files: Lib/sqlite3/* +Files: Modules/_sqlite/* +Copyright: (C) 2004-2005 Gerhard Häring +License: This software is provided 'as-is', without any express or implied + warranty. In no event will the authors be held liable for any damages + arising from the use of this software. + + Permission is granted to anyone to use this software for any purpose, + including commercial applications, and to alter it and redistribute it + freely, subject to the following restrictions: + + 1. The origin of this software must not be misrepresented; you must not + claim that you wrote the original software. If you use this software + in a product, an acknowledgment in the product documentation would be + appreciated but is not required. + 2. Altered source versions must be plainly marked as such, and must not be + misrepresented as being the original software. + 3. This notice may not be removed or altered from any source distribution. + +Files: Lib/async* +Copyright: Copyright 1996 by Sam Rushing +License: Permission to use, copy, modify, and distribute this software and + its documentation for any purpose and without fee is hereby + granted, provided that the above copyright notice appear in all + copies and that both that copyright notice and this permission + notice appear in supporting documentation, and that the name of Sam + Rushing not be used in advertising or publicity pertaining to + distribution of the software without specific, written prior + permission. 
+ + SAM RUSHING DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, + INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS, IN + NO EVENT SHALL SAM RUSHING BE LIABLE FOR ANY SPECIAL, INDIRECT OR + CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS + OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, + NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN + CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. + +Files: Lib/tarfile.py +Copyright: (C) 2002 Lars Gustaebel +License: Permission is hereby granted, free of charge, to any person + obtaining a copy of this software and associated documentation + files (the "Software"), to deal in the Software without + restriction, including without limitation the rights to use, + copy, modify, merge, publish, distribute, sublicense, and/or sell + copies of the Software, and to permit persons to whom the + Software is furnished to do so, subject to the following + conditions: + + The above copyright notice and this permission notice shall be + included in all copies or substantial portions of the Software. + + THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, + EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES + OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND + NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT + HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, + WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR + OTHER DEALINGS IN THE SOFTWARE. + +Files: Lib/turtle.py +Copyright: (C) 2006 - 2010 Gregor Lingl +License: This software is provided 'as-is', without any express or implied + warranty. In no event will the authors be held liable for any damages + arising from the use of this software. + + Permission is granted to anyone to use this software for any purpose, + including commercial applications, and to alter it and redistribute it + freely, subject to the following restrictions: + + 1. The origin of this software must not be misrepresented; you must not + claim that you wrote the original software. If you use this software + in a product, an acknowledgment in the product documentation would be + appreciated but is not required. + 2. Altered source versions must be plainly marked as such, and must not be + misrepresented as being the original software. + 3. This notice may not be removed or altered from any source distribution. + + is copyright Gregor Lingl and licensed under a BSD-like license + +Files: Modules/_ctypes/libffi/* +Copyright: Copyright (C) 1996-2011 Red Hat, Inc and others. + Copyright (C) 1996-2011 Anthony Green + Copyright (C) 1996-2010 Free Software Foundation, Inc + Copyright (c) 2003, 2004, 2006, 2007, 2008 Kaz Kojima + Copyright (c) 2010, 2011, Plausible Labs Cooperative , Inc. 
+ Copyright (c) 2010 CodeSourcery + Copyright (c) 1998 Andreas Schwab + Copyright (c) 2000 Hewlett Packard Company + Copyright (c) 2009 Bradley Smith + Copyright (c) 2008 David Daney + Copyright (c) 2004 Simon Posnjak + Copyright (c) 2005 Axis Communications AB + Copyright (c) 1998 Cygnus Solutions + Copyright (c) 2004 Renesas Technology + Copyright (c) 2002, 2007 Bo Thorsen + Copyright (c) 2002 Ranjit Mathew + Copyright (c) 2002 Roger Sayle + Copyright (c) 2000, 2007 Software AG + Copyright (c) 2003 Jakub Jelinek + Copyright (c) 2000, 2001 John Hornkvist + Copyright (c) 1998 Geoffrey Keating + Copyright (c) 2008 Björn König + +License: Permission is hereby granted, free of charge, to any person obtaining + a copy of this software and associated documentation files (the + ``Software''), to deal in the Software without restriction, including + without limitation the rights to use, copy, modify, merge, publish, + distribute, sublicense, and/or sell copies of the Software, and to + permit persons to whom the Software is furnished to do so, subject to + the following conditions: + + The above copyright notice and this permission notice shall be included + in all copies or substantial portions of the Software. + + THE SOFTWARE IS PROVIDED ``AS IS'', WITHOUT WARRANTY OF ANY KIND, + EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF + MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND + NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT + HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, + WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, + OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER + DEALINGS IN THE SOFTWARE. + + Documentation: + Permission is granted to copy, distribute and/or modify this document + under the terms of the GNU General Public License as published by the + Free Software Foundation; either version 2, or (at your option) any + later version. A copy of the license is included in the + section entitled ``GNU General Public License''. + +Files: Modules/_gestalt.c +Copyright: 1991-1997 by Stichting Mathematisch Centrum, Amsterdam. +License: Permission to use, copy, modify, and distribute this software and its + documentation for any purpose and without fee is hereby granted, + provided that the above copyright notice appear in all copies and that + both that copyright notice and this permission notice appear in + supporting documentation, and that the names of Stichting Mathematisch + Centrum or CWI not be used in advertising or publicity pertaining to + distribution of the software without specific, written prior permission. + + STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO + THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND + FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE + FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES + WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN + ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT + OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
+ +Files: Modules/syslogmodule.c +Copyright: 1994 by Lance Ellinghouse +License: Permission to use, copy, modify, and distribute this software and its + documentation for any purpose and without fee is hereby granted, + provided that the above copyright notice appear in all copies and that + both that copyright notice and this permission notice appear in + supporting documentation, and that the name of Lance Ellinghouse + not be used in advertising or publicity pertaining to distribution + of the software without specific, written prior permission. + + LANCE ELLINGHOUSE DISCLAIMS ALL WARRANTIES WITH REGARD TO + THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND + FITNESS, IN NO EVENT SHALL LANCE ELLINGHOUSE BE LIABLE FOR ANY SPECIAL, + INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING + FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, + NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION + WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. + +Files: Modules/zlib/* +Copyright: (C) 1995-2010 Jean-loup Gailly and Mark Adler +License: This software is provided 'as-is', without any express or implied + warranty. In no event will the authors be held liable for any damages + arising from the use of this software. + + Permission is granted to anyone to use this software for any purpose, + including commercial applications, and to alter it and redistribute it + freely, subject to the following restrictions: + + 1. The origin of this software must not be misrepresented; you must not + claim that you wrote the original software. If you use this software + in a product, an acknowledgment in the product documentation would be + appreciated but is not required. + 2. Altered source versions must be plainly marked as such, and must not be + misrepresented as being the original software. + 3. This notice may not be removed or altered from any source distribution. + + Jean-loup Gailly Mark Adler + jloup@gzip.org madler@alumni.caltech.edu + + If you use the zlib library in a product, we would appreciate *not* receiving + lengthy legal documents to sign. The sources are provided for free but without + warranty of any kind. The library has been entirely written by Jean-loup + Gailly and Mark Adler; it does not include third-party code. + +Files: Modules/expat/* +Copyright: Copyright (c) 1998, 1999, 2000 Thai Open Source Software Center Ltd + and Clark Cooper + Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006 Expat maintainers +License: Permission is hereby granted, free of charge, to any person obtaining + a copy of this software and associated documentation files (the + "Software"), to deal in the Software without restriction, including + without limitation the rights to use, copy, modify, merge, publish, + distribute, sublicense, and/or sell copies of the Software, and to + permit persons to whom the Software is furnished to do so, subject to + the following conditions: + + The above copyright notice and this permission notice shall be included + in all copies or substantial portions of the Software. + + THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, + EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF + MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
+ IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY + CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, + TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE + SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + +Files: Modules/_decimal/libmpdec/* +Copyright: Copyright (c) 2008-2012 Stefan Krah. All rights reserved. +License: Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions + are met: + . + 1. Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + . + 2. Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in the + documentation and/or other materials provided with the distribution. + , + THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS "AS IS" AND + ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE + IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE + ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE + FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL + DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS + OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) + HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT + LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY + OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF + SUCH DAMAGE. + +Files: Misc/python-mode.el +Copyright: Copyright (C) 1992,1993,1994 Tim Peters +License: This software is provided as-is, without express or implied + warranty. Permission to use, copy, modify, distribute or sell this + software, without fee, for any purpose and by any individual or + organization, is hereby granted, provided that the above copyright + notice and this paragraph appear in all copies. + +Files: Python/dtoa.c +Copyright: (c) 1991, 2000, 2001 by Lucent Technologies. +License: Permission to use, copy, modify, and distribute this software for any + purpose without fee is hereby granted, provided that this entire notice + is included in all copies of any software which is or includes a copy + or modification of this software and in all copies of the supporting + documentation for such software. + + THIS SOFTWARE IS BEING PROVIDED "AS IS", WITHOUT ANY EXPRESS OR IMPLIED + WARRANTY. IN PARTICULAR, NEITHER THE AUTHOR NOR LUCENT MAKES ANY + REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY + OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE. + +Files: Python/getopt.c +Copyright: 1992-1994, David Gottner +License: Permission to use, copy, modify, and distribute this software and its + documentation for any purpose and without fee is hereby granted, + provided that the above copyright notice, this permission notice and + the following disclaimer notice appear unmodified in all copies. + + I DISCLAIM ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING ALL + IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL I + BE LIABLE FOR ANY SPECIAL, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY + DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA, OR PROFITS, WHETHER + IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT + OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
+ +Files: PC/_subprocess.c +Copyright: Copyright (c) 2004 by Fredrik Lundh + Copyright (c) 2004 by Secret Labs AB, http://www.pythonware.com + Copyright (c) 2004 by Peter Astrand +License: + * Permission to use, copy, modify, and distribute this software and + * its associated documentation for any purpose and without fee is + * hereby granted, provided that the above copyright notice appears in + * all copies, and that both that copyright notice and this permission + * notice appear in supporting documentation, and that the name of the + * authors not be used in advertising or publicity pertaining to + * distribution of the software without specific, written prior + * permission. + * + * THE AUTHORS DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, + * INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. + * IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY SPECIAL, INDIRECT OR + * CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS + * OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, + * NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION + * WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. + +Files: PC/winsound.c +Copyright: Copyright (c) 1999 Toby Dickenson +License: * Permission to use this software in any way is granted without + * fee, provided that the copyright notice above appears in all + * copies. This software is provided "as is" without any warranty. + */ + +/* Modified by Guido van Rossum */ +/* Beep added by Mark Hammond */ +/* Win9X Beep and platform identification added by Uncle Timmy */ + +Files: Tools/pybench/* +Copyright: (c), 1997-2006, Marc-Andre Lemburg (mal@lemburg.com) + (c), 2000-2006, eGenix.com Software GmbH (info@egenix.com) +License: Permission to use, copy, modify, and distribute this software and its + documentation for any purpose and without fee or royalty is hereby + granted, provided that the above copyright notice appear in all copies + and that both that copyright notice and this permission notice appear + in supporting documentation or portions thereof, including + modifications, that you make. + + THE AUTHOR MARC-ANDRE LEMBURG DISCLAIMS ALL WARRANTIES WITH REGARD TO + THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND + FITNESS, IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, + INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING + FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, + NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION + WITH THE USE OR PERFORMANCE OF THIS SOFTWARE ! --- python3.4-3.4.1.orig/debian/depgraph.py +++ python3.4-3.4.1/debian/depgraph.py @@ -0,0 +1,199 @@ +#! /usr/bin/python3 + +# Copyright 2004 Toby Dickenson +# +# Permission is hereby granted, free of charge, to any person obtaining +# a copy of this software and associated documentation files (the +# "Software"), to deal in the Software without restriction, including +# without limitation the rights to use, copy, modify, merge, publish, +# distribute, sublicense, and/or sell copies of the Software, and to +# permit persons to whom the Software is furnished to do so, subject +# to the following conditions: +# +# The above copyright notice and this permission notice shall be included +# in all copies or substantial portions of the Software. 
+# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + + +import sys, getopt, colorsys, imp, hashlib + +class pydepgraphdot: + + def main(self,argv): + opts,args = getopt.getopt(argv,'',['mono']) + self.colored = 1 + for o,v in opts: + if o=='--mono': + self.colored = 0 + self.render() + + def fix(self,s): + # Convert a module name to a syntactically correct node name + return s.replace('.','_') + + def render(self): + p,t = self.get_data() + + # normalise our input data + for k,d in list(p.items()): + for v in list(d.keys()): + if v not in p: + p[v] = {} + + f = self.get_output_file() + + f.write('digraph G {\n') + #f.write('concentrate = true;\n') + #f.write('ordering = out;\n') + f.write('ranksep=1.0;\n') + f.write('node [style=filled,fontname=Helvetica,fontsize=10];\n') + allkd = list(p.items()) + allkd.sort() + for k,d in allkd: + tk = t.get(k) + if self.use(k,tk): + allv = list(d.keys()) + allv.sort() + for v in allv: + tv = t.get(v) + if self.use(v,tv) and not self.toocommon(v,tv): + f.write('%s -> %s' % ( self.fix(k),self.fix(v) ) ) + self.write_attributes(f,self.edge_attributes(k,v)) + f.write(';\n') + f.write(self.fix(k)) + self.write_attributes(f,self.node_attributes(k,tk)) + f.write(';\n') + f.write('}\n') + + def write_attributes(self,f,a): + if a: + f.write(' [') + f.write(','.join(a)) + f.write(']') + + def node_attributes(self,k,type): + a = [] + a.append('label="%s"' % self.label(k)) + if self.colored: + a.append('fillcolor="%s"' % self.color(k,type)) + else: + a.append('fillcolor=white') + if self.toocommon(k,type): + a.append('peripheries=2') + return a + + def edge_attributes(self,k,v): + a = [] + weight = self.weight(k,v) + if weight!=1: + a.append('weight=%d' % weight) + length = self.alien(k,v) + if length: + a.append('minlen=%d' % length) + return a + + def get_data(self): + t = eval(sys.stdin.read()) + return t['depgraph'],t['types'] + + def get_output_file(self): + return sys.stdout + + def use(self,s,type): + # Return true if this module is interesting and should be drawn. Return false + # if it should be completely omitted. This is a default policy - please override. + if s=='__main__': + return 0 + #if s in ('os','sys','time','__future__','types','re','string'): + if s in ('sys'): + # nearly all modules use all of these... more or less. They add nothing to + # our diagram. + return 0 + if s.startswith('encodings.'): + return 0 + if self.toocommon(s,type): + # A module where we dont want to draw references _to_. Dot doesnt handle these + # well, so it is probably best to not draw them at all. + return 0 + return 1 + + def toocommon(self,s,type): + # Return true if references to this module are uninteresting. Such references + # do not get drawn. This is a default policy - please override. + # + if s=='__main__': + # references *to* __main__ are never interesting. omitting them means + # that main floats to the top of the page + return 1 + #if type==imp.PKG_DIRECTORY: + # # dont draw references to packages. + # return 1 + return 0 + + def weight(self,a,b): + # Return the weight of the dependency from a to b. 
Higher weights + # usually have shorter straighter edges. Return 1 if it has normal weight. + # A value of 4 is usually good for ensuring that a related pair of modules + # are drawn next to each other. This is a default policy - please override. + # + if b.split('.')[-1].startswith('_'): + # A module that starts with an underscore. You need a special reason to + # import these (for example random imports _random), so draw them close + # together + return 4 + return 1 + + def alien(self,a,b): + # Return non-zero if references to this module are strange, and should be drawn + # extra-long. the value defines the length, in rank. This is also good for putting some + # vertical space between seperate subsystems. This is a default policy - please override. + # + return 0 + + def label(self,s): + # Convert a module name to a formatted node label. This is a default policy - please override. + # + return '\\.\\n'.join(s.split('.')) + + def color(self,s,type): + # Return the node color for this module name. This is a default policy - please override. + # + # Calculate a color systematically based on the hash of the module name. Modules in the + # same package have the same color. Unpackaged modules are grey + t = self.normalise_module_name_for_hash_coloring(s,type) + return self.color_from_name(t) + + def normalise_module_name_for_hash_coloring(self,s,type): + if type==imp.PKG_DIRECTORY: + return s + else: + i = s.rfind('.') + if i<0: + return '' + else: + return s[:i] + + def color_from_name(self,name): + n = hashlib.md5(name.encode('utf-8')).digest() + hf = float(n[0]+n[1]*0xff)/0xffff + sf = float(n[2])/0xff + vf = float(n[3])/0xff + r,g,b = colorsys.hsv_to_rgb(hf, 0.3+0.6*sf, 0.8+0.2*vf) + return '#%02x%02x%02x' % (r*256,g*256,b*256) + + +def main(): + pydepgraphdot().main(sys.argv[1:]) + +if __name__=='__main__': + main() + + + --- python3.4-3.4.1.orig/debian/dh_doclink +++ python3.4-3.4.1/debian/dh_doclink @@ -0,0 +1,28 @@ +#! /bin/sh + +pkg=`echo $1 | sed 's/^-p//'` +target=$2 + +ln -sf $target debian/$pkg/usr/share/doc/$pkg + +f=debian/$pkg.postinst.debhelper +if [ ! -e $f ] || [ "`grep -c '^# dh_doclink' $f`" -eq 0 ]; then +cat >> $f <> $f <. +# + +set -e + +DIRLIST="/usr/lib/python@VER@/idlelib" + +case "$1" in + configure|abort-upgrade|abort-remove|abort-deconfigure) + + for i in $DIRLIST ; do + @PVER@ /usr/lib/@PVER@/compileall.py -q $i + if grep -sq '^byte-compile[^#]*optimize' /etc/python/debian_config + then + @PVER@ -O /usr/lib/@PVER@/compileall.py -q $i + fi + done + ;; + + *) + echo "postinst called with unknown argument \`$1'" >&2 + exit 1 + ;; + +esac + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/idle-PVER.postrm.in +++ python3.4-3.4.1/debian/idle-PVER.postrm.in @@ -0,0 +1,11 @@ +#! /bin/sh + +set -e + +if [ "$1" = "purge" ]; then + rm -rf /etc/idle-@PVER@ +fi + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/idle-PVER.prerm.in +++ python3.4-3.4.1/debian/idle-PVER.prerm.in @@ -0,0 +1,33 @@ +#! 
/bin/sh + +set -e + +remove_bytecode() +{ + pkg=$1 + max=$(LANG=C LC_ALL=C xargs --show-limits < /dev/null 2>&1 | awk '/Maximum/ {print int($NF / 4)}') + dpkg -L $pkg \ + | awk -F/ 'BEGIN {OFS="/"} /\.py$/ {$NF=sprintf("__pycache__/%s.*.py[co]", substr($NF,1,length($NF)-3)); print}' \ + | xargs --max-chars=$max echo \ + | while read files; do rm -f $files; done + find /usr/lib/@PVER@ -name dist-packages -prune -o -name __pycache__ -empty -print \ + | xargs -r rm -rf +} + +case "$1" in + remove|upgrade) + remove_bytecode idle-@PVER@ + ;; + deconfigure) + ;; + failed-upgrade) + ;; + *) + echo "prerm called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/idle.desktop.in +++ python3.4-3.4.1/debian/idle.desktop.in @@ -0,0 +1,9 @@ +[Desktop Entry] +Name=IDLE (using Python-@VER@) +Comment=Integrated Development Environment for Python (using Python-@VER@) +Exec=/usr/bin/idle-@PVER@ +Icon=/usr/share/pixmaps/@PVER@.xpm +Terminal=false +Type=Application +Categories=Application;Development; +StartupNotify=true --- python3.4-3.4.1.orig/debian/libPVER-dbg.overrides.in +++ python3.4-3.4.1/debian/libPVER-dbg.overrides.in @@ -0,0 +1,13 @@ +lib@PVER@-dbg binary: package-name-doesnt-match-sonames +lib@PVER@-dbg binary: non-dev-pkg-with-shlib-symlink + +# no, it's not unusual +lib@PVER@-dbg binary: unusual-interpreter + +# just the gdb debug file +lib@PVER@-dbg binary: python-script-but-no-python-dep + +# pointless lintian ... +lib@PVER@-dbg binary: hardening-no-fortify-functions + +lib@PVER@-dbg binary: arch-dependent-file-not-in-arch-specific-directory --- python3.4-3.4.1.orig/debian/libPVER-dbg.prerm.in +++ python3.4-3.4.1/debian/libPVER-dbg.prerm.in @@ -0,0 +1,23 @@ +#! /bin/sh + +set -e + +case "$1" in + remove) + dpkg -L lib@PVER@-dbg@HOST_QUAL@ \ + | awk '/\.py$/ {print $0"c\n" $0"o"}' \ + | xargs -r rm -f >&2 + ;; + upgrade) + ;; + deconfigure) + ;; + failed-upgrade) + ;; + *) + echo "prerm called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# --- python3.4-3.4.1.orig/debian/libPVER-dbg.symbols.i386.in +++ python3.4-3.4.1/debian/libPVER-dbg.symbols.i386.in @@ -0,0 +1,30 @@ +libpython@VER@dm.so.1.0 libpython@VER@-dbg #MINVER# +#include "libpython.symbols" + _Py_force_double@Base @SVER@ + _Py_get_387controlword@Base @SVER@ + _Py_set_387controlword@Base @SVER@ + _PyDict_Dummy@Base @SVER@ + _PyObject_DebugMallocStats@Base @SVER@ + _PySet_Dummy@Base @SVER@ + _PyUnicode_CheckConsistency@Base @SVER@ + _PyUnicode_Dump@Base @SVER@ + _PyUnicode_compact_data@Base @SVER@ + _PyUnicode_data@Base @SVER@ + _PyUnicode_utf8@Base @SVER@ + _Py_AddToAllObjects@Base @SVER@ + _Py_Dealloc@Base @SVER@ + _Py_ForgetReference@Base @SVER@ + _Py_GetObjects@Base @SVER@ + _Py_GetRefTotal@Base @SVER@ + _Py_HashSecret_Initialized@Base @SVER@ + _Py_NegativeRefcount@Base @SVER@ + _Py_NewReference@Base @SVER@ + _Py_PrintReferenceAddresses@Base @SVER@ + _Py_PrintReferences@Base @SVER@ + _Py_RefTotal@Base @SVER@ + _Py_dumptree@Base @SVER@ + _Py_hashtable_print_stats@Base @SVER@ + _Py_printtree@Base @SVER@ + _Py_showtree@Base @SVER@ + _Py_tok_dump@Base @SVER@ + PyModule_Create2TraceRefs@Base @SVER@ --- python3.4-3.4.1.orig/debian/libPVER-dbg.symbols.in +++ python3.4-3.4.1/debian/libPVER-dbg.symbols.in @@ -0,0 +1,27 @@ +libpython@VER@dm.so.1.0 libpython@VER@-dbg #MINVER# +#include "libpython.symbols" + _PyDict_Dummy@Base @SVER@ + _PyObject_DebugMallocStats@Base @SVER@ + _PySet_Dummy@Base @SVER@ + _PyUnicode_CheckConsistency@Base @SVER@ + _PyUnicode_Dump@Base 
@SVER@ + _PyUnicode_compact_data@Base @SVER@ + _PyUnicode_data@Base @SVER@ + _PyUnicode_utf8@Base @SVER@ + _Py_AddToAllObjects@Base @SVER@ + _Py_Dealloc@Base @SVER@ + _Py_ForgetReference@Base @SVER@ + _Py_GetObjects@Base @SVER@ + _Py_GetRefTotal@Base @SVER@ + _Py_HashSecret_Initialized@Base @SVER@ + _Py_NegativeRefcount@Base @SVER@ + _Py_NewReference@Base @SVER@ + _Py_PrintReferenceAddresses@Base @SVER@ + _Py_PrintReferences@Base @SVER@ + _Py_RefTotal@Base @SVER@ + _Py_dumptree@Base @SVER@ + _Py_hashtable_print_stats@Base @SVER@ + _Py_printtree@Base @SVER@ + _Py_showtree@Base @SVER@ + _Py_tok_dump@Base @SVER@ + PyModule_Create2TraceRefs@Base @SVER@ --- python3.4-3.4.1.orig/debian/libPVER-dev.overrides.in +++ python3.4-3.4.1/debian/libPVER-dev.overrides.in @@ -0,0 +1,3 @@ +lib@PVER@-dev binary: python-script-but-no-python-dep + +lib@PVER@-dev binary: arch-dependent-file-not-in-arch-specific-directory --- python3.4-3.4.1.orig/debian/libPVER-minimal.overrides.in +++ python3.4-3.4.1/debian/libPVER-minimal.overrides.in @@ -0,0 +1,5 @@ +# intentional +lib@PVER@-minimal binary: python-script-but-no-python-dep + +# lintian omission, multiarch string is encoded in the filename +lib@PVER@-minimal binary: arch-dependent-file-not-in-arch-specific-directory --- python3.4-3.4.1.orig/debian/libPVER-minimal.postinst.in +++ python3.4-3.4.1/debian/libPVER-minimal.postinst.in @@ -0,0 +1,13 @@ +#! /bin/sh + +set -e + +if [ ! -f /etc/@PVER@/sitecustomize.py ]; then + cat <<-EOF + # Empty sitecustomize.py to avoid a dangling symlink +EOF +fi + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/libPVER-minimal.postrm.in +++ python3.4-3.4.1/debian/libPVER-minimal.postrm.in @@ -0,0 +1,21 @@ +#! /bin/sh + +set -e + +if [ "$1" = "purge" ]; then + pc=$(dpkg-query -f '${db:Status-Abbrev} ${binary:Package}\n' -W pkgname \ + | grep -v '^.n' | wc -l) + if [ "$pc" -lt 1 ]; then + find /usr/lib/@PVER@ -depth -type d -name __pycache__ \ + | xargs -r rm -rf + rm -f /etc/@PVER@/sitecustomize.py + rm -rf /etc/@PVER@/__pycache__ + if [ -d /etc/@PVER@ ]; then + rmdir --ignore-fail-on-non-empty /etc/@PVER@ + fi + fi +fi + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/libPVER-minimal.prerm.in +++ python3.4-3.4.1/debian/libPVER-minimal.prerm.in @@ -0,0 +1,41 @@ +#! 
/bin/sh + +set -e + +remove_bytecode() +{ + pkg=$1 + max=$(LANG=C LC_ALL=C xargs --show-limits < /dev/null 2>&1 | awk '/Maximum/ {print int($NF / 4)}') + dpkg -L $pkg \ + | awk -F/ 'BEGIN {OFS="/"} /\.py$/ {$NF=sprintf("__pycache__/%s.*.py[co]", substr($NF,1,length($NF)-3)); print}' \ + | xargs --max-chars="$max" echo \ + | while read files; do rm -f $files; done + find /usr/lib/python3 /usr/lib/@PVER@ \ + \( -name dist-packages -prune \) -o \ + \( -name __pycache__ -type d -empty -print \) \ + | xargs -r rm -rf +} + +case "$1" in + remove) + pc=$(dpkg-query -f '${db:Status-Abbrev} ${binary:Package}\n' -W pkgname \ + | grep -v '^.n' | wc -l) + if [ "$pc" -le 1 ]; then + remove_bytecode lib@PVER@-minimal@HOST_QUAL@ + fi + ;; + upgrade) + remove_bytecode lib@PVER@-minimal@HOST_QUAL@ + # byte compilation in @PVER@-minimal postinst, strict dependency + ;; + deconfigure) + ;; + failed-upgrade) + ;; + *) + echo "prerm called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# --- python3.4-3.4.1.orig/debian/libPVER-stdlib.overrides.in +++ python3.4-3.4.1/debian/libPVER-stdlib.overrides.in @@ -0,0 +1,15 @@ +# idlelib images +lib@PVER@-stdlib binary: image-file-in-usr-lib + +# license file referred by the standard library +lib@PVER@-stdlib binary: extra-license-file + +# template files +lib@PVER@-stdlib binary: interpreter-not-absolute usr/lib/@PVER@/venv/scripts/posix/pydoc #!__VENV_PYTHON__ +lib@PVER@-stdlib binary: unusual-interpreter usr/lib/@PVER@/venv/scripts/posix/pydoc #!__VENV_PYTHON__ + +# the split is the reason for that +lib@PVER@-stdlib binary: python-script-but-no-python-dep + +# lintian omission, multiarch string is encoded in the filename +lib@PVER@-stdlib binary: arch-dependent-file-not-in-arch-specific-directory --- python3.4-3.4.1.orig/debian/libPVER-stdlib.prerm.in +++ python3.4-3.4.1/debian/libPVER-stdlib.prerm.in @@ -0,0 +1,41 @@ +#! /bin/sh + +set -e + +remove_bytecode() +{ + pkg=$1 + max=$(LANG=C LC_ALL=C xargs --show-limits < /dev/null 2>&1 | awk '/Maximum/ {print int($NF / 4)}') + dpkg -L $pkg \ + | awk -F/ 'BEGIN {OFS="/"} /\.py$/ {$NF=sprintf("__pycache__/%s.*.py[co]", substr($NF,1,length($NF)-3)); print}' \ + | xargs --max-chars="$max" echo \ + | while read files; do rm -f $files; done + find /usr/lib/python3 /usr/lib/@PVER@ \ + \( -name dist-packages -prune \) -o \ + \( -name __pycache__ -type d -empty -print \) \ + | xargs -r rm -rf +} + +case "$1" in + remove) + pc=$(dpkg-query -f '${db:Status-Abbrev} ${binary:Package}\n' -W pkgname \ + | grep -v '^.n' | wc -l) + if [ "$pc" -le 1 ]; then + remove_bytecode lib@PVER@-stdlib@HOST_QUAL@ + fi + ;; + upgrade) + remove_bytecode lib@PVER@-stdlib@HOST_QUAL@ + # byte compilation in @PVER@ postinst, strict dependency + ;; + deconfigure) + ;; + failed-upgrade) + ;; + *) + echo "prerm called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# --- python3.4-3.4.1.orig/debian/libPVER-testsuite.overrides.in +++ python3.4-3.4.1/debian/libPVER-testsuite.overrides.in @@ -0,0 +1,2 @@ +lib@PVER@-testsuite binary: python-script-but-no-python-dep +lib@PVER@-testsuite binary: image-file-in-usr-lib --- python3.4-3.4.1.orig/debian/libPVER-testsuite.postinst.in +++ python3.4-3.4.1/debian/libPVER-testsuite.postinst.in @@ -0,0 +1,20 @@ +#! 
/bin/sh + +set -e + +case "$1" in + configure) + files=$(dpkg -L lib@PVER@-testsuite | sed -n '/^\/usr\/lib\/@PVER@\/.*\.py$/p' | egrep -v '/lib2to3/tests/data|/test/bad') + if [ -n "$files" ]; then + @PVER@ -E -S /usr/lib/@PVER@/py_compile.py $files + if grep -sq '^byte-compile[^#]*optimize' /etc/python/debian_config; then + @PVER@ -E -S -O /usr/lib/@PVER@/py_compile.py $files + fi + else + echo >&2 "@PVER@: can't get files for byte-compilation" + fi +esac + +#DEBHELPER# + +exit 0 --- python3.4-3.4.1.orig/debian/libPVER-testsuite.prerm.in +++ python3.4-3.4.1/debian/libPVER-testsuite.prerm.in @@ -0,0 +1,36 @@ +#! /bin/sh + +set -e + +remove_bytecode() +{ + pkg=$1 + max=$(LANG=C LC_ALL=C xargs --show-limits < /dev/null 2>&1 | awk '/Maximum/ {print int($NF / 4)}') + dpkg -L $pkg \ + | awk -F/ 'BEGIN {OFS="/"} /\.py$/ {$NF=sprintf("__pycache__/%s.*.py[co]", substr($NF,1,length($NF)-3)); print}' \ + | xargs --max-chars="$max" echo \ + | while read files; do rm -f $files; done + + find /usr/lib/@PVER@ \ + -name __pycache__ -type d -empty -print \ + | xargs -r rm -rf +} + +case "$1" in + remove) + remove_bytecode lib@PVER@-testsuite + ;; + upgrade) + remove_bytecode lib@PVER@-testsuite + ;; + deconfigure) + ;; + failed-upgrade) + ;; + *) + echo "prerm called with unknown argument \`$1'" >&2 + exit 1 + ;; +esac + +#DEBHELPER# --- python3.4-3.4.1.orig/debian/libPVER.overrides.in +++ python3.4-3.4.1/debian/libPVER.overrides.in @@ -0,0 +1 @@ +lib@PVER@ binary: package-name-doesnt-match-sonames --- python3.4-3.4.1.orig/debian/libPVER.symbols.i386.in +++ python3.4-3.4.1/debian/libPVER.symbols.i386.in @@ -0,0 +1,8 @@ +libpython@VER@m.so.1.0 libpython@VER@ #MINVER# +#include "libpython.symbols" + PyModule_Create2@Base @SVER@ + _Py_force_double@Base @SVER@ + _Py_get_387controlword@Base @SVER@ + _Py_set_387controlword@Base @SVER@ + + (optional)__gnu_lto_v1@Base @SVER@ --- python3.4-3.4.1.orig/debian/libPVER.symbols.in +++ python3.4-3.4.1/debian/libPVER.symbols.in @@ -0,0 +1,5 @@ +libpython@VER@m.so.1.0 libpython@VER@ #MINVER# +#include "libpython.symbols" + PyModule_Create2@Base @SVER@ + + (optional)__gnu_lto_v1@Base @SVER@ --- python3.4-3.4.1.orig/debian/libPVER.symbols.lpia.in +++ python3.4-3.4.1/debian/libPVER.symbols.lpia.in @@ -0,0 +1,6 @@ +libpython@VER@m.so.1.0 libpython@VER@ #MINVER# +#include "libpython.symbols" + PyModule_Create2@Base @SVER@ + _Py_force_double@Base @SVER@ + _Py_get_387controlword@Base @SVER@ + _Py_set_387controlword@Base @SVER@ --- python3.4-3.4.1.orig/debian/libpython.symbols.in +++ python3.4-3.4.1/debian/libpython.symbols.in @@ -0,0 +1,1565 @@ + PyAST_Check@Base @SVER@ + PyAST_Compile@Base @SVER@ + PyAST_CompileEx@Base @SVER@ + PyAST_CompileObject@Base @SVER@ + PyAST_FromNode@Base @SVER@ + PyAST_FromNodeObject@Base @SVER@ + PyAST_Validate@Base @SVER@ + PyAST_mod2obj@Base @SVER@ + PyAST_obj2mod@Base @SVER@ + PyArena_AddPyObject@Base @SVER@ + PyArena_Free@Base @SVER@ + PyArena_Malloc@Base @SVER@ + PyArena_New@Base @SVER@ + PyArg_Parse@Base @SVER@ + PyArg_ParseTuple@Base @SVER@ + PyArg_ParseTupleAndKeywords@Base @SVER@ + PyArg_UnpackTuple@Base @SVER@ + PyArg_VaParse@Base @SVER@ + PyArg_VaParseTupleAndKeywords@Base @SVER@ + PyArg_ValidateKeywordArguments@Base @SVER@ + PyBaseObject_Type@Base @SVER@ + PyBool_FromLong@Base @SVER@ + PyBool_Type@Base @SVER@ + PyBuffer_FillContiguousStrides@Base @SVER@ + PyBuffer_FillInfo@Base @SVER@ + PyBuffer_FromContiguous@Base @SVER@ + PyBuffer_GetPointer@Base @SVER@ + PyBuffer_IsContiguous@Base @SVER@ + PyBuffer_Release@Base @SVER@ + 
PyBuffer_ToContiguous@Base @SVER@ + PyBufferedIOBase_Type@Base @SVER@ + PyBufferedRWPair_Type@Base @SVER@ + PyBufferedRandom_Type@Base @SVER@ + PyBufferedReader_Type@Base @SVER@ + PyBufferedWriter_Type@Base @SVER@ + PyByteArrayIter_Type@Base @SVER@ + PyByteArray_AsString@Base @SVER@ + PyByteArray_Concat@Base @SVER@ + PyByteArray_Fini@Base @SVER@ + PyByteArray_FromObject@Base @SVER@ + PyByteArray_FromStringAndSize@Base @SVER@ + PyByteArray_Init@Base @SVER@ + PyByteArray_Resize@Base @SVER@ + PyByteArray_Size@Base @SVER@ + PyByteArray_Type@Base @SVER@ + PyBytesIO_Type@Base @SVER@ + PyBytesIter_Type@Base @SVER@ + PyBytes_AsString@Base @SVER@ + PyBytes_AsStringAndSize@Base @SVER@ + PyBytes_Concat@Base @SVER@ + PyBytes_ConcatAndDel@Base @SVER@ + PyBytes_DecodeEscape@Base @SVER@ + PyBytes_Fini@Base @SVER@ + PyBytes_FromFormat@Base @SVER@ + PyBytes_FromFormatV@Base @SVER@ + PyBytes_FromObject@Base @SVER@ + PyBytes_FromString@Base @SVER@ + PyBytes_FromStringAndSize@Base @SVER@ + PyBytes_Repr@Base @SVER@ + PyBytes_Size@Base @SVER@ + PyBytes_Type@Base @SVER@ + (optional)PyCArgObject_new@Base @SVER@ + (optional)PyCArg_Type@Base @SVER@ + (optional)PyCArrayType_Type@Base @SVER@ + (optional)PyCArrayType_from_ctype@Base @SVER@ + (optional)PyCArray_Type@Base @SVER@ + (optional)PyCData_AtAddress@Base @SVER@ + (optional)PyCData_FromBaseObj@Base @SVER@ + (optional)PyCData_Type@Base @SVER@ + (optional)PyCData_get@Base @SVER@ + (optional)PyCData_set@Base @SVER@ + (optional)PyCField_FromDesc@Base @SVER@ + (optional)PyCField_Type@Base @SVER@ + (optional)PyCFuncPtrType_Type@Base @SVER@ + (optional)PyCFuncPtr_Type@Base @SVER@ + PyCFunction_Call@Base @SVER@ + PyCFunction_ClearFreeList@Base @SVER@ + PyCFunction_Fini@Base @SVER@ + PyCFunction_GetFlags@Base @SVER@ + PyCFunction_GetFunction@Base @SVER@ + PyCFunction_GetSelf@Base @SVER@ + PyCFunction_New@Base @SVER@ + PyCFunction_NewEx@Base @SVER@ + PyCFunction_Type@Base @SVER@ + (optional)PyCPointerType_Type@Base @SVER@ + (optional)PyCPointer_Type@Base @SVER@ + (optional)PyCSimpleType_Type@Base @SVER@ + (optional)PyCStgDict_Type@Base @SVER@ + (optional)PyCStgDict_clone@Base @SVER@ + (optional)PyCStructType_Type@Base @SVER@ + (optional)PyCStructUnionType_update_stgdict@Base @SVER@ + (optional)PyCThunk_Type@Base @SVER@ + PyCallIter_New@Base @SVER@ + PyCallIter_Type@Base @SVER@ + PyCallable_Check@Base @SVER@ + PyCapsule_GetContext@Base @SVER@ + PyCapsule_GetDestructor@Base @SVER@ + PyCapsule_GetName@Base @SVER@ + PyCapsule_GetPointer@Base @SVER@ + PyCapsule_Import@Base @SVER@ + PyCapsule_IsValid@Base @SVER@ + PyCapsule_New@Base @SVER@ + PyCapsule_SetContext@Base @SVER@ + PyCapsule_SetDestructor@Base @SVER@ + PyCapsule_SetName@Base @SVER@ + PyCapsule_SetPointer@Base @SVER@ + PyCapsule_Type@Base @SVER@ + PyCell_Get@Base @SVER@ + PyCell_New@Base @SVER@ + PyCell_Set@Base @SVER@ + PyCell_Type@Base @SVER@ + PyClassMethodDescr_Type@Base @SVER@ + PyClassMethod_New@Base @SVER@ + PyClassMethod_Type@Base @SVER@ + PyCode_Addr2Line@Base @SVER@ + PyCode_New@Base @SVER@ + PyCode_NewEmpty@Base @SVER@ + PyCode_Optimize@Base @SVER@ + PyCode_Type@Base @SVER@ + PyCodec_BackslashReplaceErrors@Base @SVER@ + PyCodec_Decode@Base @SVER@ + PyCodec_Decoder@Base @SVER@ + PyCodec_Encode@Base @SVER@ + PyCodec_Encoder@Base @SVER@ + PyCodec_IgnoreErrors@Base @SVER@ + PyCodec_IncrementalDecoder@Base @SVER@ + PyCodec_IncrementalEncoder@Base @SVER@ + PyCodec_KnownEncoding@Base @SVER@ + PyCodec_LookupError@Base @SVER@ + PyCodec_Register@Base @SVER@ + PyCodec_RegisterError@Base @SVER@ + 
PyCodec_ReplaceErrors@Base @SVER@ + PyCodec_StreamReader@Base @SVER@ + PyCodec_StreamWriter@Base @SVER@ + PyCodec_StrictErrors@Base @SVER@ + PyCodec_XMLCharRefReplaceErrors@Base @SVER@ + PyCompileString@Base @SVER@ + PyCompile_OpcodeStackEffect@Base @SVER@ + PyComplex_AsCComplex@Base @SVER@ + PyComplex_FromCComplex@Base @SVER@ + PyComplex_FromDoubles@Base @SVER@ + PyComplex_ImagAsDouble@Base @SVER@ + PyComplex_RealAsDouble@Base @SVER@ + PyComplex_Type@Base @SVER@ + PyDescr_NewClassMethod@Base @SVER@ + PyDescr_NewGetSet@Base @SVER@ + PyDescr_NewMember@Base @SVER@ + PyDescr_NewMethod@Base @SVER@ + PyDescr_NewWrapper@Base @SVER@ + PyDictItems_Type@Base @SVER@ + PyDictIterItem_Type@Base @SVER@ + PyDictIterKey_Type@Base @SVER@ + PyDictIterValue_Type@Base @SVER@ + PyDictKeys_Type@Base @SVER@ + PyDictProxy_New@Base @SVER@ + PyDictProxy_Type@Base @SVER@ + PyDictValues_Type@Base @SVER@ + PyDict_Clear@Base @SVER@ + PyDict_ClearFreeList@Base @SVER@ + PyDict_Contains@Base @SVER@ + PyDict_Copy@Base @SVER@ + PyDict_DelItem@Base @SVER@ + PyDict_DelItemString@Base @SVER@ + PyDict_Fini@Base @SVER@ + PyDict_GetItem@Base @SVER@ + (optional)PyDict_GetItemProxy@Base @SVER@ + PyDict_GetItemString@Base @SVER@ + PyDict_GetItemWithError@Base @SVER@ + PyDict_Items@Base @SVER@ + PyDict_Keys@Base @SVER@ + PyDict_Merge@Base @SVER@ + PyDict_MergeFromSeq2@Base @SVER@ + PyDict_New@Base @SVER@ + PyDict_Next@Base @SVER@ + PyDict_SetDefault@Base @SVER@ + PyDict_SetItem@Base @SVER@ + (optional)PyDict_SetItemProxy@Base @SVER@ + PyDict_SetItemString@Base @SVER@ + PyDict_Size@Base @SVER@ + PyDict_Type@Base @SVER@ + PyDict_Update@Base @SVER@ + PyDict_Values@Base @SVER@ + PyEllipsis_Type@Base @SVER@ + PyEnum_Type@Base @SVER@ + PyErr_BadArgument@Base @SVER@ + PyErr_BadInternalCall@Base @SVER@ + PyErr_CheckSignals@Base @SVER@ + PyErr_Clear@Base @SVER@ + PyErr_Display@Base @SVER@ + PyErr_ExceptionMatches@Base @SVER@ + PyErr_Fetch@Base @SVER@ + PyErr_Format@Base @SVER@ + PyErr_GetExcInfo@Base @SVER@ + PyErr_GivenExceptionMatches@Base @SVER@ + PyErr_NewException@Base @SVER@ + PyErr_NewExceptionWithDoc@Base @SVER@ + PyErr_NoMemory@Base @SVER@ + PyErr_NormalizeException@Base @SVER@ + PyErr_Occurred@Base @SVER@ + PyErr_Print@Base @SVER@ + PyErr_PrintEx@Base @SVER@ + PyErr_ProgramText@Base @SVER@ + PyErr_ProgramTextObject@Base @SVER@ + PyErr_Restore@Base @SVER@ + PyErr_SetExcInfo@Base @SVER@ + PyErr_SetFromErrno@Base @SVER@ + PyErr_SetFromErrnoWithFilename@Base @SVER@ + PyErr_SetFromErrnoWithFilenameObject@Base @SVER@ + PyErr_SetFromErrnoWithFilenameObjects@Base @SVER@ + PyErr_SetImportError@Base @SVER@ + PyErr_SetInterrupt@Base @SVER@ + PyErr_SetNone@Base @SVER@ + PyErr_SetObject@Base @SVER@ + PyErr_SetString@Base @SVER@ + PyErr_SyntaxLocation@Base @SVER@ + PyErr_SyntaxLocationEx@Base @SVER@ + PyErr_SyntaxLocationObject@Base @SVER@ + PyErr_Warn@Base @SVER@ + PyErr_WarnEx@Base @SVER@ + PyErr_WarnExplicit@Base @SVER@ + PyErr_WarnExplicitFormat@Base @SVER@ + PyErr_WarnExplicitObject@Base @SVER@ + PyErr_WarnFormat@Base @SVER@ + PyErr_WriteUnraisable@Base @SVER@ + PyEval_AcquireLock@Base @SVER@ + PyEval_AcquireThread@Base @SVER@ + PyEval_CallFunction@Base @SVER@ + PyEval_CallMethod@Base @SVER@ + PyEval_CallObjectWithKeywords@Base @SVER@ + PyEval_EvalCode@Base @SVER@ + PyEval_EvalCodeEx@Base @SVER@ + PyEval_EvalFrame@Base @SVER@ + PyEval_EvalFrameEx@Base @SVER@ + PyEval_GetBuiltins@Base @SVER@ + PyEval_GetCallStats@Base @SVER@ + PyEval_GetFrame@Base @SVER@ + PyEval_GetFuncDesc@Base @SVER@ + PyEval_GetFuncName@Base @SVER@ + 
PyEval_GetGlobals@Base @SVER@ + PyEval_GetLocals@Base @SVER@ + PyEval_InitThreads@Base @SVER@ + PyEval_MergeCompilerFlags@Base @SVER@ + PyEval_ReInitThreads@Base @SVER@ + PyEval_ReleaseLock@Base @SVER@ + PyEval_ReleaseThread@Base @SVER@ + PyEval_RestoreThread@Base @SVER@ + PyEval_SaveThread@Base @SVER@ + PyEval_SetProfile@Base @SVER@ + PyEval_SetTrace@Base @SVER@ + PyEval_ThreadsInitialized@Base @SVER@ + (optional)PyExc_ArgError@Base @SVER@ + PyExc_ArithmeticError@Base @SVER@ + PyExc_AssertionError@Base @SVER@ + PyExc_AttributeError@Base @SVER@ + PyExc_BaseException@Base @SVER@ + PyExc_BlockingIOError@Base @SVER@ + PyExc_BrokenPipeError@Base @SVER@ + PyExc_BufferError@Base @SVER@ + PyExc_BytesWarning@Base @SVER@ + PyExc_ChildProcessError@Base @SVER@ + PyExc_ConnectionAbortedError@Base @SVER@ + PyExc_ConnectionError@Base @SVER@ + PyExc_ConnectionRefusedError@Base @SVER@ + PyExc_ConnectionResetError@Base @SVER@ + PyExc_DeprecationWarning@Base @SVER@ + PyExc_EOFError@Base @SVER@ + PyExc_EnvironmentError@Base @SVER@ + PyExc_Exception@Base @SVER@ + PyExc_FileExistsError@Base @SVER@ + PyExc_FileNotFoundError@Base @SVER@ + PyExc_FloatingPointError@Base @SVER@ + PyExc_FutureWarning@Base @SVER@ + PyExc_GeneratorExit@Base @SVER@ + PyExc_IOError@Base @SVER@ + PyExc_ImportError@Base @SVER@ + PyExc_ImportWarning@Base @SVER@ + PyExc_IndentationError@Base @SVER@ + PyExc_IndexError@Base @SVER@ + PyExc_InterruptedError@Base @SVER@ + PyExc_IsADirectoryError@Base @SVER@ + PyExc_KeyError@Base @SVER@ + PyExc_KeyboardInterrupt@Base @SVER@ + PyExc_LookupError@Base @SVER@ + PyExc_MemoryError@Base @SVER@ + PyExc_NameError@Base @SVER@ + PyExc_NotADirectoryError@Base @SVER@ + PyExc_NotImplementedError@Base @SVER@ + PyExc_OSError@Base @SVER@ + PyExc_OverflowError@Base @SVER@ + PyExc_PendingDeprecationWarning@Base @SVER@ + PyExc_PermissionError@Base @SVER@ + PyExc_ProcessLookupError@Base @SVER@ + PyExc_RecursionErrorInst@Base @SVER@ + PyExc_ReferenceError@Base @SVER@ + PyExc_ResourceWarning@Base @SVER@ + PyExc_RuntimeError@Base @SVER@ + PyExc_RuntimeWarning@Base @SVER@ + PyExc_StopIteration@Base @SVER@ + PyExc_SyntaxError@Base @SVER@ + PyExc_SyntaxWarning@Base @SVER@ + PyExc_SystemError@Base @SVER@ + PyExc_SystemExit@Base @SVER@ + PyExc_TabError@Base @SVER@ + PyExc_TimeoutError@Base @SVER@ + PyExc_TypeError@Base @SVER@ + PyExc_UnboundLocalError@Base @SVER@ + PyExc_UnicodeDecodeError@Base @SVER@ + PyExc_UnicodeEncodeError@Base @SVER@ + PyExc_UnicodeError@Base @SVER@ + PyExc_UnicodeTranslateError@Base @SVER@ + PyExc_UnicodeWarning@Base @SVER@ + PyExc_UserWarning@Base @SVER@ + PyExc_ValueError@Base @SVER@ + PyExc_Warning@Base @SVER@ + PyExc_ZeroDivisionError@Base @SVER@ + PyException_GetCause@Base @SVER@ + PyException_GetContext@Base @SVER@ + PyException_GetTraceback@Base @SVER@ + PyException_SetCause@Base @SVER@ + PyException_SetContext@Base @SVER@ + PyException_SetTraceback@Base @SVER@ + PyFPE_counter@Base @SVER@ + PyFPE_dummy@Base @SVER@ + PyFPE_jbuf@Base @SVER@ + PyFileIO_Type@Base @SVER@ + PyFile_FromFd@Base @SVER@ + PyFile_GetLine@Base @SVER@ + PyFile_NewStdPrinter@Base @SVER@ + PyFile_WriteObject@Base @SVER@ + PyFile_WriteString@Base @SVER@ + PyFilter_Type@Base @SVER@ + PyFloat_AsDouble@Base @SVER@ + PyFloat_ClearFreeList@Base @SVER@ + PyFloat_Fini@Base @SVER@ + PyFloat_FromDouble@Base @SVER@ + PyFloat_FromString@Base @SVER@ + PyFloat_GetInfo@Base @SVER@ + PyFloat_GetMax@Base @SVER@ + PyFloat_GetMin@Base @SVER@ + PyFloat_Type@Base @SVER@ + PyFrame_BlockPop@Base @SVER@ + PyFrame_BlockSetup@Base @SVER@ + 
PyFrame_ClearFreeList@Base @SVER@ + PyFrame_FastToLocals@Base @SVER@ + PyFrame_FastToLocalsWithError@Base @SVER@ + PyFrame_Fini@Base @SVER@ + PyFrame_GetLineNumber@Base @SVER@ + PyFrame_LocalsToFast@Base @SVER@ + PyFrame_New@Base @SVER@ + PyFrame_Type@Base @SVER@ + PyFrozenSet_New@Base @SVER@ + PyFrozenSet_Type@Base @SVER@ + PyFunction_GetAnnotations@Base @SVER@ + PyFunction_GetClosure@Base @SVER@ + PyFunction_GetCode@Base @SVER@ + PyFunction_GetDefaults@Base @SVER@ + PyFunction_GetGlobals@Base @SVER@ + PyFunction_GetKwDefaults@Base @SVER@ + PyFunction_GetModule@Base @SVER@ + PyFunction_New@Base @SVER@ + PyFunction_NewWithQualName@Base @SVER@ + PyFunction_SetAnnotations@Base @SVER@ + PyFunction_SetClosure@Base @SVER@ + PyFunction_SetDefaults@Base @SVER@ + PyFunction_SetKwDefaults@Base @SVER@ + PyFunction_Type@Base @SVER@ + PyFuture_FromAST@Base @SVER@ + PyFuture_FromASTObject@Base @SVER@ + PyGC_Collect@Base @SVER@ + PyGILState_Check@Base @SVER@ + PyGILState_Ensure@Base @SVER@ + PyGILState_GetThisThreadState@Base @SVER@ + PyGILState_Release@Base @SVER@ + PyGen_NeedsFinalizing@Base @SVER@ + PyGen_New@Base @SVER@ + PyGen_Type@Base @SVER@ + PyGetSetDescr_Type@Base @SVER@ + PyGrammar_AddAccelerators@Base @SVER@ + PyGrammar_FindDFA@Base @SVER@ + PyGrammar_LabelRepr@Base @SVER@ + PyGrammar_RemoveAccelerators@Base @SVER@ + PyHash_GetFuncDef@Base @SVER@ + PyIOBase_Type@Base @SVER@ + PyImport_AddModule@Base @SVER@ + PyImport_AddModuleObject@Base @SVER@ + PyImport_AppendInittab@Base @SVER@ + PyImport_Cleanup@Base @SVER@ + PyImport_ExecCodeModule@Base @SVER@ + PyImport_ExecCodeModuleEx@Base @SVER@ + PyImport_ExecCodeModuleObject@Base @SVER@ + PyImport_ExecCodeModuleWithPathnames@Base @SVER@ + PyImport_ExtendInittab@Base @SVER@ + PyImport_FrozenModules@Base @SVER@ + PyImport_GetImporter@Base @SVER@ + PyImport_GetMagicNumber@Base @SVER@ + PyImport_GetMagicTag@Base @SVER@ + PyImport_GetModuleDict@Base @SVER@ + PyImport_Import@Base @SVER@ + PyImport_ImportFrozenModule@Base @SVER@ + PyImport_ImportFrozenModuleObject@Base @SVER@ + PyImport_ImportModule@Base @SVER@ + PyImport_ImportModuleLevel@Base @SVER@ + PyImport_ImportModuleLevelObject@Base @SVER@ + PyImport_ImportModuleNoBlock@Base @SVER@ + PyImport_Inittab@Base @SVER@ + PyImport_ReloadModule@Base @SVER@ + PyIncrementalNewlineDecoder_Type@Base @SVER@ + PyInstanceMethod_Function@Base @SVER@ + PyInstanceMethod_New@Base @SVER@ + PyInstanceMethod_Type@Base @SVER@ + PyInterpreterState_Clear@Base @SVER@ + PyInterpreterState_Delete@Base @SVER@ + PyInterpreterState_Head@Base @SVER@ + PyInterpreterState_New@Base @SVER@ + PyInterpreterState_Next@Base @SVER@ + PyInterpreterState_ThreadHead@Base @SVER@ + PyIter_Next@Base @SVER@ + PyListIter_Type@Base @SVER@ + PyListRevIter_Type@Base @SVER@ + PyList_Append@Base @SVER@ + PyList_AsTuple@Base @SVER@ + PyList_ClearFreeList@Base @SVER@ + PyList_Fini@Base @SVER@ + PyList_GetItem@Base @SVER@ + PyList_GetSlice@Base @SVER@ + PyList_Insert@Base @SVER@ + PyList_New@Base @SVER@ + PyList_Reverse@Base @SVER@ + PyList_SetItem@Base @SVER@ + PyList_SetSlice@Base @SVER@ + PyList_Size@Base @SVER@ + PyList_Sort@Base @SVER@ + PyList_Type@Base @SVER@ + PyLongRangeIter_Type@Base @SVER@ + PyLong_AsDouble@Base @SVER@ + PyLong_AsLong@Base @SVER@ + PyLong_AsLongAndOverflow@Base @SVER@ + PyLong_AsLongLong@Base @SVER@ + PyLong_AsLongLongAndOverflow@Base @SVER@ + PyLong_AsSize_t@Base @SVER@ + PyLong_AsSsize_t@Base @SVER@ + PyLong_AsUnsignedLong@Base @SVER@ + PyLong_AsUnsignedLongLong@Base @SVER@ + PyLong_AsUnsignedLongLongMask@Base @SVER@ + 
PyLong_AsUnsignedLongMask@Base @SVER@ + PyLong_AsVoidPtr@Base @SVER@ + PyLong_Fini@Base @SVER@ + PyLong_FromDouble@Base @SVER@ + PyLong_FromLong@Base @SVER@ + PyLong_FromLongLong@Base @SVER@ + PyLong_FromSize_t@Base @SVER@ + PyLong_FromSsize_t@Base @SVER@ + PyLong_FromString@Base @SVER@ + PyLong_FromUnicode@Base @SVER@ + PyLong_FromUnicodeObject@Base @SVER@ + PyLong_FromUnsignedLong@Base @SVER@ + PyLong_FromUnsignedLongLong@Base @SVER@ + PyLong_FromVoidPtr@Base @SVER@ + PyLong_GetInfo@Base @SVER@ + PyLong_Type@Base @SVER@ + PyMap_Type@Base @SVER@ + PyMapping_Check@Base @SVER@ + PyMapping_GetItemString@Base @SVER@ + PyMapping_HasKey@Base @SVER@ + PyMapping_HasKeyString@Base @SVER@ + PyMapping_Items@Base @SVER@ + PyMapping_Keys@Base @SVER@ + PyMapping_Length@Base @SVER@ + PyMapping_SetItemString@Base @SVER@ + PyMapping_Size@Base @SVER@ + PyMapping_Values@Base @SVER@ + PyMarshal_Init@Base @SVER@ + PyMarshal_ReadLastObjectFromFile@Base @SVER@ + PyMarshal_ReadLongFromFile@Base @SVER@ + PyMarshal_ReadObjectFromFile@Base @SVER@ + PyMarshal_ReadObjectFromString@Base @SVER@ + PyMarshal_ReadShortFromFile@Base @SVER@ + PyMarshal_WriteLongToFile@Base @SVER@ + PyMarshal_WriteObjectToFile@Base @SVER@ + PyMarshal_WriteObjectToString@Base @SVER@ + PyMem_Free@Base @SVER@ + PyMem_GetAllocator@Base @SVER@ + PyMem_Malloc@Base @SVER@ + PyMem_RawFree@Base @SVER@ + PyMem_RawMalloc@Base @SVER@ + PyMem_RawRealloc@Base @SVER@ + PyMem_Realloc@Base @SVER@ + PyMem_SetAllocator@Base @SVER@ + PyMem_SetupDebugHooks@Base @SVER@ + PyMemberDescr_Type@Base @SVER@ + PyMember_GetOne@Base @SVER@ + PyMember_SetOne@Base @SVER@ + PyMemoryView_FromBuffer@Base @SVER@ + PyMemoryView_FromMemory@Base @SVER@ + PyMemoryView_FromObject@Base @SVER@ + PyMemoryView_GetContiguous@Base @SVER@ + PyMemoryView_Type@Base @SVER@ + PyMethodDescr_Type@Base @SVER@ + PyMethod_ClearFreeList@Base @SVER@ + PyMethod_Fini@Base @SVER@ + PyMethod_Function@Base @SVER@ + PyMethod_New@Base @SVER@ + PyMethod_Self@Base @SVER@ + PyMethod_Type@Base @SVER@ + PyModule_AddIntConstant@Base @SVER@ + PyModule_AddObject@Base @SVER@ + PyModule_AddStringConstant@Base @SVER@ + PyModule_GetDef@Base @SVER@ + PyModule_GetDict@Base @SVER@ + PyModule_GetFilename@Base @SVER@ + PyModule_GetFilenameObject@Base @SVER@ + PyModule_GetName@Base @SVER@ + PyModule_GetNameObject@Base @SVER@ + PyModule_GetState@Base @SVER@ + PyModule_GetWarningsModule@Base @SVER@ + PyModule_New@Base @SVER@ + PyModule_NewObject@Base @SVER@ + PyModule_Type@Base @SVER@ + PyNode_AddChild@Base @SVER@ + PyNode_Compile@Base @SVER@ + PyNode_Free@Base @SVER@ + PyNode_ListTree@Base @SVER@ + PyNode_New@Base @SVER@ + PyNumber_Absolute@Base @SVER@ + PyNumber_Add@Base @SVER@ + PyNumber_And@Base @SVER@ + PyNumber_AsOff_t@Base @SVER@ + PyNumber_AsSsize_t@Base @SVER@ + PyNumber_Check@Base @SVER@ + PyNumber_Divmod@Base @SVER@ + PyNumber_Float@Base @SVER@ + PyNumber_FloorDivide@Base @SVER@ + PyNumber_InPlaceAdd@Base @SVER@ + PyNumber_InPlaceAnd@Base @SVER@ + PyNumber_InPlaceFloorDivide@Base @SVER@ + PyNumber_InPlaceLshift@Base @SVER@ + PyNumber_InPlaceMultiply@Base @SVER@ + PyNumber_InPlaceOr@Base @SVER@ + PyNumber_InPlacePower@Base @SVER@ + PyNumber_InPlaceRemainder@Base @SVER@ + PyNumber_InPlaceRshift@Base @SVER@ + PyNumber_InPlaceSubtract@Base @SVER@ + PyNumber_InPlaceTrueDivide@Base @SVER@ + PyNumber_InPlaceXor@Base @SVER@ + PyNumber_Index@Base @SVER@ + PyNumber_Invert@Base @SVER@ + PyNumber_Long@Base @SVER@ + PyNumber_Lshift@Base @SVER@ + PyNumber_Multiply@Base @SVER@ + PyNumber_Negative@Base @SVER@ + PyNumber_Or@Base 
@SVER@ + PyNumber_Positive@Base @SVER@ + PyNumber_Power@Base @SVER@ + PyNumber_Remainder@Base @SVER@ + PyNumber_Rshift@Base @SVER@ + PyNumber_Subtract@Base @SVER@ + PyNumber_ToBase@Base @SVER@ + PyNumber_TrueDivide@Base @SVER@ + PyNumber_Xor@Base @SVER@ + PyOS_AfterFork@Base @SVER@ + PyOS_FiniInterrupts@Base @SVER@ + PyOS_InitInterrupts@Base @SVER@ + PyOS_InputHook@Base @SVER@ + PyOS_InterruptOccurred@Base @SVER@ + PyOS_Readline@Base @SVER@ + PyOS_ReadlineFunctionPointer@Base @SVER@ + PyOS_StdioReadline@Base @SVER@ + PyOS_double_to_string@Base @SVER@ + PyOS_getsig@Base @SVER@ + PyOS_mystricmp@Base @SVER@ + PyOS_mystrnicmp@Base @SVER@ + PyOS_setsig@Base @SVER@ + PyOS_snprintf@Base @SVER@ + PyOS_string_to_double@Base @SVER@ + PyOS_strtol@Base @SVER@ + PyOS_strtoul@Base @SVER@ + PyOS_vsnprintf@Base @SVER@ + PyObject_ASCII@Base @SVER@ + PyObject_AsCharBuffer@Base @SVER@ + PyObject_AsFileDescriptor@Base @SVER@ + PyObject_AsReadBuffer@Base @SVER@ + PyObject_AsWriteBuffer@Base @SVER@ + PyObject_Bytes@Base @SVER@ + PyObject_Call@Base @SVER@ + PyObject_CallFinalizer@Base @SVER@ + PyObject_CallFinalizerFromDealloc@Base @SVER@ + PyObject_CallFunction@Base @SVER@ + PyObject_CallFunctionObjArgs@Base @SVER@ + PyObject_CallMethod@Base @SVER@ + PyObject_CallMethodObjArgs@Base @SVER@ + PyObject_CallObject@Base @SVER@ + PyObject_CheckReadBuffer@Base @SVER@ + PyObject_ClearWeakRefs@Base @SVER@ + PyObject_CopyData@Base @SVER@ + PyObject_DelItem@Base @SVER@ + PyObject_DelItemString@Base @SVER@ + PyObject_Dir@Base @SVER@ + PyObject_Format@Base @SVER@ + PyObject_Free@Base @SVER@ + PyObject_GC_Del@Base @SVER@ + PyObject_GC_Track@Base @SVER@ + PyObject_GC_UnTrack@Base @SVER@ + PyObject_GenericGetAttr@Base @SVER@ + PyObject_GenericGetDict@Base @SVER@ + PyObject_GenericSetAttr@Base @SVER@ + PyObject_GenericSetDict@Base @SVER@ + PyObject_GetArenaAllocator@Base @SVER@ + PyObject_GetAttr@Base @SVER@ + PyObject_GetAttrString@Base @SVER@ + PyObject_GetBuffer@Base @SVER@ + PyObject_GetItem@Base @SVER@ + PyObject_GetIter@Base @SVER@ + PyObject_HasAttr@Base @SVER@ + PyObject_HasAttrString@Base @SVER@ + PyObject_Hash@Base @SVER@ + PyObject_HashNotImplemented@Base @SVER@ + PyObject_Init@Base @SVER@ + PyObject_InitVar@Base @SVER@ + PyObject_IsInstance@Base @SVER@ + PyObject_IsSubclass@Base @SVER@ + PyObject_IsTrue@Base @SVER@ + PyObject_Length@Base @SVER@ + PyObject_LengthHint@Base @SVER@ + PyObject_Malloc@Base @SVER@ + PyObject_Not@Base @SVER@ + PyObject_Print@Base @SVER@ + PyObject_Realloc@Base @SVER@ + PyObject_Repr@Base @SVER@ + PyObject_RichCompare@Base @SVER@ + PyObject_RichCompareBool@Base @SVER@ + PyObject_SelfIter@Base @SVER@ + PyObject_SetArenaAllocator@Base @SVER@ + PyObject_SetAttr@Base @SVER@ + PyObject_SetAttrString@Base @SVER@ + PyObject_SetItem@Base @SVER@ + PyObject_Size@Base @SVER@ + PyObject_Str@Base @SVER@ + PyObject_Type@Base @SVER@ + (optional)PyObject_stgdict@Base @SVER@ + PyParser_ASTFromFile@Base @SVER@ + PyParser_ASTFromFileObject@Base @SVER@ + PyParser_ASTFromString@Base @SVER@ + PyParser_ASTFromStringObject@Base @SVER@ + PyParser_AddToken@Base @SVER@ + PyParser_ClearError@Base @SVER@ + PyParser_Delete@Base @SVER@ + PyParser_New@Base @SVER@ + PyParser_ParseFile@Base @SVER@ + PyParser_ParseFileFlags@Base @SVER@ + PyParser_ParseFileFlagsEx@Base @SVER@ + PyParser_ParseFileObject@Base @SVER@ + PyParser_ParseString@Base @SVER@ + PyParser_ParseStringFlags@Base @SVER@ + PyParser_ParseStringFlagsFilename@Base @SVER@ + PyParser_ParseStringFlagsFilenameEx@Base @SVER@ + PyParser_ParseStringObject@Base @SVER@ + 
PyParser_SetError@Base @SVER@ + PyParser_SimpleParseFile@Base @SVER@ + PyParser_SimpleParseFileFlags@Base @SVER@ + PyParser_SimpleParseString@Base @SVER@ + PyParser_SimpleParseStringFilename@Base @SVER@ + PyParser_SimpleParseStringFlags@Base @SVER@ + PyParser_SimpleParseStringFlagsFilename@Base @SVER@ + PyProperty_Type@Base @SVER@ + PyRangeIter_Type@Base @SVER@ + PyRange_Type@Base @SVER@ + PyRawIOBase_Type@Base @SVER@ + PyReversed_Type@Base @SVER@ + PyRun_AnyFile@Base @SVER@ + PyRun_AnyFileEx@Base @SVER@ + PyRun_AnyFileExFlags@Base @SVER@ + PyRun_AnyFileFlags@Base @SVER@ + PyRun_File@Base @SVER@ + PyRun_FileEx@Base @SVER@ + PyRun_FileExFlags@Base @SVER@ + PyRun_FileFlags@Base @SVER@ + PyRun_InteractiveLoop@Base @SVER@ + PyRun_InteractiveLoopFlags@Base @SVER@ + PyRun_InteractiveOne@Base @SVER@ + PyRun_InteractiveOneFlags@Base @SVER@ + PyRun_InteractiveOneObject@Base @SVER@ + PyRun_SimpleFile@Base @SVER@ + PyRun_SimpleFileEx@Base @SVER@ + PyRun_SimpleFileExFlags@Base @SVER@ + PyRun_SimpleString@Base @SVER@ + PyRun_SimpleStringFlags@Base @SVER@ + PyRun_String@Base @SVER@ + PyRun_StringFlags@Base @SVER@ + PySTEntry_Type@Base @SVER@ + PyST_GetScope@Base @SVER@ + PySeqIter_New@Base @SVER@ + PySeqIter_Type@Base @SVER@ + PySequence_Check@Base @SVER@ + PySequence_Concat@Base @SVER@ + PySequence_Contains@Base @SVER@ + PySequence_Count@Base @SVER@ + PySequence_DelItem@Base @SVER@ + PySequence_DelSlice@Base @SVER@ + PySequence_Fast@Base @SVER@ + PySequence_GetItem@Base @SVER@ + PySequence_GetSlice@Base @SVER@ + PySequence_In@Base @SVER@ + PySequence_InPlaceConcat@Base @SVER@ + PySequence_InPlaceRepeat@Base @SVER@ + PySequence_Index@Base @SVER@ + PySequence_Length@Base @SVER@ + PySequence_List@Base @SVER@ + PySequence_Repeat@Base @SVER@ + PySequence_SetItem@Base @SVER@ + PySequence_SetSlice@Base @SVER@ + PySequence_Size@Base @SVER@ + PySequence_Tuple@Base @SVER@ + PySetIter_Type@Base @SVER@ + PySet_Add@Base @SVER@ + PySet_Clear@Base @SVER@ + PySet_ClearFreeList@Base @SVER@ + PySet_Contains@Base @SVER@ + PySet_Discard@Base @SVER@ + PySet_Fini@Base @SVER@ + PySet_New@Base @SVER@ + PySet_Pop@Base @SVER@ + PySet_Size@Base @SVER@ + PySet_Type@Base @SVER@ + PySignal_SetWakeupFd@Base @SVER@ + PySlice_Fini@Base @SVER@ + PySlice_GetIndices@Base @SVER@ + PySlice_GetIndicesEx@Base @SVER@ + PySlice_New@Base @SVER@ + PySlice_Type@Base @SVER@ + PyState_AddModule@Base @SVER@ + PyState_FindModule@Base @SVER@ + PyState_RemoveModule@Base @SVER@ + PyStaticMethod_New@Base @SVER@ + PyStaticMethod_Type@Base @SVER@ + PyStdPrinter_Type@Base @SVER@ + PyStringIO_Type@Base @SVER@ + PyStructSequence_GetItem@Base @SVER@ + PyStructSequence_InitType2@Base @SVER@ + PyStructSequence_InitType@Base @SVER@ + PyStructSequence_New@Base @SVER@ + PyStructSequence_NewType@Base @SVER@ + PyStructSequence_SetItem@Base @SVER@ + PyStructSequence_UnnamedField@Base @SVER@ + PySuper_Type@Base @SVER@ + PySymtable_Build@Base @SVER@ + PySymtable_BuildObject@Base @SVER@ + PySymtable_Free@Base @SVER@ + PySymtable_Lookup@Base @SVER@ + PySys_AddWarnOption@Base @SVER@ + PySys_AddWarnOptionUnicode@Base @SVER@ + PySys_AddXOption@Base @SVER@ + PySys_FormatStderr@Base @SVER@ + PySys_FormatStdout@Base @SVER@ + PySys_GetObject@Base @SVER@ + PySys_GetXOptions@Base @SVER@ + PySys_HasWarnOptions@Base @SVER@ + PySys_ResetWarnOptions@Base @SVER@ + PySys_SetArgv@Base @SVER@ + PySys_SetArgvEx@Base @SVER@ + PySys_SetObject@Base @SVER@ + PySys_SetPath@Base @SVER@ + PySys_WriteStderr@Base @SVER@ + PySys_WriteStdout@Base @SVER@ + PyTextIOBase_Type@Base @SVER@ + 
PyTextIOWrapper_Type@Base @SVER@ + PyThreadState_Clear@Base @SVER@ + PyThreadState_Delete@Base @SVER@ + PyThreadState_DeleteCurrent@Base @SVER@ + PyThreadState_Get@Base @SVER@ + PyThreadState_GetDict@Base @SVER@ + PyThreadState_New@Base @SVER@ + PyThreadState_Next@Base @SVER@ + PyThreadState_SetAsyncExc@Base @SVER@ + PyThreadState_Swap@Base @SVER@ + PyThread_GetInfo@Base @SVER@ + PyThread_ReInitTLS@Base @SVER@ + PyThread_acquire_lock@Base @SVER@ + PyThread_acquire_lock_timed@Base @SVER@ + PyThread_allocate_lock@Base @SVER@ + PyThread_create_key@Base @SVER@ + PyThread_delete_key@Base @SVER@ + PyThread_delete_key_value@Base @SVER@ + PyThread_exit_thread@Base @SVER@ + PyThread_free_lock@Base @SVER@ + PyThread_get_key_value@Base @SVER@ + PyThread_get_stacksize@Base @SVER@ + PyThread_get_thread_ident@Base @SVER@ + PyThread_init_thread@Base @SVER@ + PyThread_release_lock@Base @SVER@ + PyThread_set_key_value@Base @SVER@ + PyThread_set_stacksize@Base @SVER@ + PyThread_start_new_thread@Base @SVER@ + PyToken_OneChar@Base @SVER@ + PyToken_ThreeChars@Base @SVER@ + PyToken_TwoChars@Base @SVER@ + PyTokenizer_FindEncoding@Base @SVER@ + PyTokenizer_FindEncodingFilename@Base @SVER@ + PyTokenizer_Free@Base @SVER@ + PyTokenizer_FromFile@Base @SVER@ + PyTokenizer_FromString@Base @SVER@ + PyTokenizer_FromUTF8@Base @SVER@ + PyTokenizer_Get@Base @SVER@ + PyTraceBack_Here@Base @SVER@ + PyTraceBack_Print@Base @SVER@ + PyTraceBack_Type@Base @SVER@ + PyTupleIter_Type@Base @SVER@ + PyTuple_ClearFreeList@Base @SVER@ + PyTuple_Fini@Base @SVER@ + PyTuple_GetItem@Base @SVER@ + PyTuple_GetSlice@Base @SVER@ + PyTuple_New@Base @SVER@ + PyTuple_Pack@Base @SVER@ + PyTuple_SetItem@Base @SVER@ + PyTuple_Size@Base @SVER@ + PyTuple_Type@Base @SVER@ + PyType_ClearCache@Base @SVER@ + PyType_FromSpec@Base @SVER@ + PyType_FromSpecWithBases@Base @SVER@ + PyType_GenericAlloc@Base @SVER@ + PyType_GenericNew@Base @SVER@ + PyType_GetFlags@Base @SVER@ + PyType_GetSlot@Base @SVER@ + PyType_IsSubtype@Base @SVER@ + PyType_Modified@Base @SVER@ + PyType_Ready@Base @SVER@ + PyType_Type@Base @SVER@ + (optional)PyType_stgdict@Base @SVER@ + PyUnicodeDecodeError_Create@Base @SVER@ + PyUnicodeDecodeError_GetEncoding@Base @SVER@ + PyUnicodeDecodeError_GetEnd@Base @SVER@ + PyUnicodeDecodeError_GetObject@Base @SVER@ + PyUnicodeDecodeError_GetReason@Base @SVER@ + PyUnicodeDecodeError_GetStart@Base @SVER@ + PyUnicodeDecodeError_SetEnd@Base @SVER@ + PyUnicodeDecodeError_SetReason@Base @SVER@ + PyUnicodeDecodeError_SetStart@Base @SVER@ + PyUnicodeEncodeError_Create@Base @SVER@ + PyUnicodeEncodeError_GetEncoding@Base @SVER@ + PyUnicodeEncodeError_GetEnd@Base @SVER@ + PyUnicodeEncodeError_GetObject@Base @SVER@ + PyUnicodeEncodeError_GetReason@Base @SVER@ + PyUnicodeEncodeError_GetStart@Base @SVER@ + PyUnicodeEncodeError_SetEnd@Base @SVER@ + PyUnicodeEncodeError_SetReason@Base @SVER@ + PyUnicodeEncodeError_SetStart@Base @SVER@ + PyUnicodeIter_Type@Base @SVER@ + PyUnicodeTranslateError_Create@Base @SVER@ + PyUnicodeTranslateError_GetEnd@Base @SVER@ + PyUnicodeTranslateError_GetObject@Base @SVER@ + PyUnicodeTranslateError_GetReason@Base @SVER@ + PyUnicodeTranslateError_GetStart@Base @SVER@ + PyUnicodeTranslateError_SetEnd@Base @SVER@ + PyUnicodeTranslateError_SetReason@Base @SVER@ + PyUnicodeTranslateError_SetStart@Base @SVER@ + PyUnicode_Append@Base @SVER@ + PyUnicode_AppendAndDel@Base @SVER@ + PyUnicode_AsASCIIString@Base @SVER@ + PyUnicode_AsCharmapString@Base @SVER@ + PyUnicode_AsDecodedObject@Base @SVER@ + PyUnicode_AsDecodedUnicode@Base @SVER@ + 
PyUnicode_AsEncodedObject@Base @SVER@ + PyUnicode_AsEncodedString@Base @SVER@ + PyUnicode_AsEncodedUnicode@Base @SVER@ + PyUnicode_AsLatin1String@Base @SVER@ + PyUnicode_AsRawUnicodeEscapeString@Base @SVER@ + PyUnicode_AsUCS4@Base @SVER@ + PyUnicode_AsUCS4Copy@Base @SVER@ + PyUnicode_AsUTF16String@Base @SVER@ + PyUnicode_AsUTF32String@Base @SVER@ + PyUnicode_AsUTF8@Base @SVER@ + PyUnicode_AsUTF8AndSize@Base @SVER@ + PyUnicode_AsUTF8String@Base @SVER@ + PyUnicode_AsUnicode@Base @SVER@ + PyUnicode_AsUnicodeAndSize@Base @SVER@ + PyUnicode_AsUnicodeCopy@Base @SVER@ + PyUnicode_AsUnicodeEscapeString@Base @SVER@ + PyUnicode_AsWideChar@Base @SVER@ + PyUnicode_AsWideCharString@Base @SVER@ + PyUnicode_BuildEncodingMap@Base @SVER@ + PyUnicode_ClearFreeList@Base @SVER@ + PyUnicode_Compare@Base @SVER@ + PyUnicode_CompareWithASCIIString@Base @SVER@ + PyUnicode_Concat@Base @SVER@ + PyUnicode_Contains@Base @SVER@ + PyUnicode_CopyCharacters@Base @SVER@ + PyUnicode_Count@Base @SVER@ + PyUnicode_Decode@Base @SVER@ + PyUnicode_DecodeASCII@Base @SVER@ + PyUnicode_DecodeCharmap@Base @SVER@ + PyUnicode_DecodeFSDefault@Base @SVER@ + PyUnicode_DecodeFSDefaultAndSize@Base @SVER@ + PyUnicode_DecodeLatin1@Base @SVER@ + PyUnicode_DecodeLocale@Base @SVER@ + PyUnicode_DecodeLocaleAndSize@Base @SVER@ + PyUnicode_DecodeRawUnicodeEscape@Base @SVER@ + PyUnicode_DecodeUTF16@Base @SVER@ + PyUnicode_DecodeUTF16Stateful@Base @SVER@ + PyUnicode_DecodeUTF32@Base @SVER@ + PyUnicode_DecodeUTF32Stateful@Base @SVER@ + PyUnicode_DecodeUTF7@Base @SVER@ + PyUnicode_DecodeUTF7Stateful@Base @SVER@ + PyUnicode_DecodeUTF8@Base @SVER@ + PyUnicode_DecodeUTF8Stateful@Base @SVER@ + PyUnicode_DecodeUnicodeEscape@Base @SVER@ + PyUnicode_Encode@Base @SVER@ + PyUnicode_EncodeASCII@Base @SVER@ + PyUnicode_EncodeCharmap@Base @SVER@ + PyUnicode_EncodeDecimal@Base @SVER@ + PyUnicode_EncodeFSDefault@Base @SVER@ + PyUnicode_EncodeUTF8@Base @SVER@ + PyUnicode_EncodeUnicodeEscape@Base @SVER@ + PyUnicode_FSConverter@Base @SVER@ + PyUnicode_FSDecoder@Base @SVER@ + PyUnicode_Fill@Base @SVER@ + PyUnicode_Find@Base @SVER@ + PyUnicode_FindChar@Base @SVER@ + PyUnicode_Format@Base @SVER@ + PyUnicode_FromEncodedObject@Base @SVER@ + PyUnicode_FromFormat@Base @SVER@ + PyUnicode_FromFormatV@Base @SVER@ + PyUnicode_FromKindAndData@Base @SVER@ + PyUnicode_FromObject@Base @SVER@ + PyUnicode_FromOrdinal@Base @SVER@ + PyUnicode_FromString@Base @SVER@ + PyUnicode_FromStringAndSize@Base @SVER@ + PyUnicode_FromUnicode@Base @SVER@ + PyUnicode_FromWideChar@Base @SVER@ + PyUnicode_GetDefaultEncoding@Base @SVER@ + PyUnicode_GetLength@Base @SVER@ + PyUnicode_GetMax@Base @SVER@ + PyUnicode_GetSize@Base @SVER@ + PyUnicode_EncodeLatin1@Base @SVER@ + PyUnicode_EncodeLocale@Base @SVER@ + PyUnicode_EncodeRawUnicodeEscape@Base @SVER@ + PyUnicode_EncodeUTF16@Base @SVER@ + PyUnicode_EncodeUTF32@Base @SVER@ + PyUnicode_EncodeUTF7@Base @SVER@ + PyUnicode_InternFromString@Base @SVER@ + PyUnicode_InternImmortal@Base @SVER@ + PyUnicode_InternInPlace@Base @SVER@ + PyUnicode_IsIdentifier@Base @SVER@ + PyUnicode_Join@Base @SVER@ + PyUnicode_New@Base @SVER@ + PyUnicode_Partition@Base @SVER@ + PyUnicode_RPartition@Base @SVER@ + PyUnicode_RSplit@Base @SVER@ + PyUnicode_ReadChar@Base @SVER@ + PyUnicode_Replace@Base @SVER@ + PyUnicode_Resize@Base @SVER@ + PyUnicode_RichCompare@Base @SVER@ + PyUnicode_Split@Base @SVER@ + PyUnicode_Splitlines@Base @SVER@ + PyUnicode_Substring@Base @SVER@ + PyUnicode_Tailmatch@Base @SVER@ + PyUnicode_TransformDecimalToASCII@Base @SVER@ + PyUnicode_Translate@Base @SVER@ + 
PyUnicode_TranslateCharmap@Base @SVER@ + PyUnicode_Type@Base @SVER@ + PyUnicode_WriteChar@Base @SVER@ + PyWeakref_GetObject@Base @SVER@ + PyWeakref_NewProxy@Base @SVER@ + PyWeakref_NewRef@Base @SVER@ + PyWrapperDescr_Type@Base @SVER@ + PyWrapper_New@Base @SVER@ + PyZip_Type@Base @SVER@ + Py_AddPendingCall@Base @SVER@ + Py_AtExit@Base @SVER@ + Py_BuildValue@Base @SVER@ + Py_BytesWarningFlag@Base @SVER@ + Py_CompileString@Base @SVER@ + Py_CompileStringExFlags@Base @SVER@ + Py_CompileStringFlags@Base @SVER@ + Py_CompileStringObject@Base @SVER@ + Py_DebugFlag@Base @SVER@ + Py_DecRef@Base @SVER@ + Py_DontWriteBytecodeFlag@Base @SVER@ + Py_EndInterpreter@Base @SVER@ + Py_Exit@Base @SVER@ + Py_FatalError@Base @SVER@ + Py_FdIsInteractive@Base @SVER@ + Py_FileSystemDefaultEncoding@Base @SVER@ + Py_Finalize@Base @SVER@ + Py_FrozenFlag@Base @SVER@ + Py_FrozenMain@Base @SVER@ + Py_GetArgcArgv@Base @SVER@ + Py_GetBuildInfo@Base @SVER@ + Py_GetCompiler@Base @SVER@ + Py_GetCopyright@Base @SVER@ + Py_GetExecPrefix@Base @SVER@ + Py_GetPath@Base @SVER@ + Py_GetPlatform@Base @SVER@ + Py_GetPrefix@Base @SVER@ + Py_GetProgramFullPath@Base @SVER@ + Py_GetProgramName@Base @SVER@ + Py_GetPythonHome@Base @SVER@ + Py_GetRecursionLimit@Base @SVER@ + Py_GetVersion@Base @SVER@ + Py_HasFileSystemDefaultEncoding@Base @SVER@ + Py_HashRandomizationFlag@Base @SVER@ + Py_IgnoreEnvironmentFlag@Base @SVER@ + Py_IncRef@Base @SVER@ + Py_Initialize@Base @SVER@ + Py_InitializeEx@Base @SVER@ + Py_InspectFlag@Base @SVER@ + Py_InteractiveFlag@Base @SVER@ + Py_IsInitialized@Base @SVER@ + Py_IsolatedFlag@Base @SVER@ + Py_Main@Base @SVER@ + Py_MakePendingCalls@Base @SVER@ + Py_NewInterpreter@Base @SVER@ + Py_NoSiteFlag@Base @SVER@ + Py_NoUserSiteDirectory@Base @SVER@ + Py_OptimizeFlag@Base @SVER@ + Py_QuietFlag@Base @SVER@ + Py_ReprEnter@Base @SVER@ + Py_ReprLeave@Base @SVER@ + Py_SetPath@Base @SVER@ + Py_SetProgramName@Base @SVER@ + Py_SetPythonHome@Base @SVER@ + Py_SetRecursionLimit@Base @SVER@ + Py_SetStandardStreamEncoding@Base @SVER@ + Py_SymtableString@Base @SVER@ + Py_SymtableStringObject@Base @SVER@ + Py_UNICODE_strcat@Base @SVER@ + Py_UNICODE_strchr@Base @SVER@ + Py_UNICODE_strcmp@Base @SVER@ + Py_UNICODE_strcpy@Base @SVER@ + Py_UNICODE_strlen@Base @SVER@ + Py_UNICODE_strncmp@Base @SVER@ + Py_UNICODE_strncpy@Base @SVER@ + Py_UNICODE_strrchr@Base @SVER@ + Py_UnbufferedStdioFlag@Base @SVER@ + Py_UniversalNewlineFgets@Base @SVER@ + Py_UseClassExceptionsFlag@Base @SVER@ + Py_VaBuildValue@Base @SVER@ + Py_VerboseFlag@Base @SVER@ + Py_hexdigits@Base @SVER@ + Py_meta_grammar@Base @SVER@ + Py_pgen@Base @SVER@ + _PyAccu_Accumulate@Base @SVER@ + _PyAccu_Destroy@Base @SVER@ + _PyAccu_Finish@Base @SVER@ + _PyAccu_FinishAsList@Base @SVER@ + _PyAccu_Init@Base @SVER@ + _PyArg_NoKeywords@Base @SVER@ + _PyArg_NoPositional@Base @SVER@ + _PyArg_ParseTupleAndKeywords_SizeT@Base @SVER@ + _PyArg_ParseTuple_SizeT@Base @SVER@ + _PyArg_Parse_SizeT@Base @SVER@ + _PyArg_VaParseTupleAndKeywords_SizeT@Base @SVER@ + _PyArg_VaParse_SizeT@Base @SVER@ + _PyBuiltin_Init@Base @SVER@ + _PyByteArray_empty_string@Base @SVER@ + _PyBytesIOBuffer_Type@Base @SVER@ + _PyBytes_Join@Base @SVER@ + _PyBytes_Resize@Base @SVER@ + _PyCFunction_DebugMallocStats@Base @SVER@ + _PyCapsule_hack@Base @SVER@ + _PyCode_CheckLineNumber@Base @SVER@ + _PyCodecInfo_GetIncrementalDecoder@Base @SVER@ + _PyCodecInfo_GetIncrementalEncoder@Base @SVER@ + _PyCodec_DecodeText@Base @SVER@ + _PyCodec_EncodeText@Base @SVER@ + _PyCodec_Lookup@Base @SVER@ + _PyCodec_LookupTextEncoding@Base @SVER@ + 
_PyComplex_FormatAdvancedWriter@Base @SVER@ + _PyDebugAllocatorStats@Base @SVER@ + _PyDictKeys_DecRef@Base @SVER@ + _PyDict_Contains@Base @SVER@ + _PyDict_DebugMallocStats@Base @SVER@ + _PyDict_DelItemId@Base @SVER@ + _PyDict_GetItemId@Base @SVER@ + _PyDict_GetItemIdWithError@Base @SVER@ + _PyDict_HasOnlyStringKeys@Base @SVER@ + _PyDict_KeysSize@Base @SVER@ + _PyDict_LoadGlobal@Base @SVER@ + _PyDict_MaybeUntrack@Base @SVER@ + _PyDict_NewKeysForClass@Base @SVER@ + _PyDict_NewPresized@Base @SVER@ + _PyDict_Next@Base @SVER@ + _PyErr_BadInternalCall@Base @SVER@ + _PyErr_SetKeyError@Base @SVER@ + _PyErr_TrySetFromCause@Base @SVER@ + _PyDict_SetItemId@Base @SVER@ + _PyEval_CallTracing@Base @SVER@ + _PyEval_FiniThreads@Base @SVER@ + _PyEval_GetSwitchInterval@Base @SVER@ + _PyEval_SetSwitchInterval@Base @SVER@ + _PyEval_SignalAsyncExc@Base @SVER@ + _PyEval_SliceIndex@Base @SVER@ + _PyExc_Fini@Base @SVER@ + _PyExc_Init@Base @SVER@ + _PyFaulthandler_Fini@Base @SVER@ + _PyFaulthandler_Init@Base @SVER@ + _PyFileIO_closed@Base @SVER@ + _PyFloat_DebugMallocStats@Base @SVER@ + _PyFloat_FormatAdvancedWriter@Base @SVER@ + _PyFloat_Init@Base @SVER@ + _PyFloat_Pack4@Base @SVER@ + _PyFloat_Pack8@Base @SVER@ + _PyFloat_Unpack4@Base @SVER@ + _PyFloat_Unpack8@Base @SVER@ + _PyFrame_DebugMallocStats@Base @SVER@ + _PyFrame_Init@Base @SVER@ + _PyGC_CollectNoFail@Base @SVER@ + _PyGC_Dump@Base @SVER@ + _PyGC_DumpShutdownStats@Base @SVER@ + _PyGC_Fini@Base @SVER@ + _PyGC_generation0@Base @SVER@ + _PyGILState_Fini@Base @SVER@ + _PyGILState_Init@Base @SVER@ + _PyGILState_Reinit@Base @SVER@ + _PyGen_FetchStopIterationValue@Base @SVER@ + _PyGen_Finalize@Base @SVER@ + _PyGen_Send@Base @SVER@ + _PyHash_Fini@Base @SVER@ + _PyIOBase_check_closed@Base @SVER@ + _PyIOBase_check_readable@Base @SVER@ + _PyIOBase_check_seekable@Base @SVER@ + _PyIOBase_check_writable@Base @SVER@ + _PyIOBase_finalize@Base @SVER@ + _PyIO_ConvertSsize_t@Base @SVER@ + _PyIO_Module@Base @SVER@ + _PyIO_empty_bytes@Base @SVER@ + _PyIO_empty_str@Base @SVER@ + _PyIO_find_line_ending@Base @SVER@ + _PyIO_get_locale_module@Base @SVER@ + _PyIO_get_module_state@Base @SVER@ + _PyIO_str_close@Base @SVER@ + _PyIO_str_closed@Base @SVER@ + _PyIO_str_decode@Base @SVER@ + _PyIO_str_encode@Base @SVER@ + _PyIO_str_fileno@Base @SVER@ + _PyIO_str_flush@Base @SVER@ + _PyIO_str_getstate@Base @SVER@ + _PyIO_str_isatty@Base @SVER@ + _PyIO_str_newlines@Base @SVER@ + _PyIO_str_nl@Base @SVER@ + _PyIO_str_read1@Base @SVER@ + _PyIO_str_read@Base @SVER@ + _PyIO_str_readable@Base @SVER@ + _PyIO_str_readall@Base @SVER@ + _PyIO_str_readinto@Base @SVER@ + _PyIO_str_readline@Base @SVER@ + _PyIO_str_reset@Base @SVER@ + _PyIO_str_seek@Base @SVER@ + _PyIO_str_seekable@Base @SVER@ + _PyIO_str_setstate@Base @SVER@ + _PyIO_str_tell@Base @SVER@ + _PyIO_str_truncate@Base @SVER@ + _PyIO_str_writable@Base @SVER@ + _PyIO_str_write@Base @SVER@ + _PyIO_trap_eintr@Base @SVER@ + _PyIO_zero@Base @SVER@ + _PyImportHooks_Init@Base @SVER@ + _PyImportZip_Init@Base @SVER@ + _PyImport_AcquireLock@Base @SVER@ + _PyImport_DynLoadFiletab@Base @SVER@ + _PyImport_FindBuiltin@Base @SVER@ + _PyImport_FindExtensionObject@Base @SVER@ + _PyImport_Fini@Base @SVER@ + _PyImport_FixupBuiltin@Base @SVER@ + _PyImport_FixupExtensionObject@Base @SVER@ + _PyImport_GetDynLoadFunc@Base @SVER@ + _PyImport_Init@Base @SVER@ + _PyImport_Inittab@Base @SVER@ + _PyImport_LoadDynamicModule@Base @SVER@ + _PyImport_ReInitLock@Base @SVER@ + _PyImport_ReleaseLock@Base @SVER@ + _PyIncrementalNewlineDecoder_decode@Base @SVER@ + 
_PyList_DebugMallocStats@Base @SVER@ + _PyList_Extend@Base @SVER@ + _PyLong_AsByteArray@Base @SVER@ + _PyLong_AsInt@Base @SVER@ + _PyLong_AsTime_t@Base @SVER@ + _PyLong_Copy@Base @SVER@ + _PyLong_DigitValue@Base @SVER@ + _PyLong_DivmodNear@Base @SVER@ + _PyLong_Format@Base @SVER@ + _PyLong_FormatAdvancedWriter@Base @SVER@ + _PyLong_FormatWriter@Base @SVER@ + _PyLong_Frexp@Base @SVER@ + _PyLong_FromByteArray@Base @SVER@ + _PyLong_FromBytes@Base @SVER@ + _PyLong_FromGid@Base @SVER@ + _PyLong_FromNbInt@Base @SVER@ + _PyLong_FromTime_t@Base @SVER@ + _PyLong_FromUid@Base @SVER@ + _PyLong_Init@Base @SVER@ + _PyLong_New@Base @SVER@ + _PyLong_NumBits@Base @SVER@ + _PyLong_Sign@Base @SVER@ + _PyManagedBuffer_Type@Base @SVER@ + _PyMem_RawStrdup@Base @SVER@ + _PyMem_Strdup@Base @SVER@ + _PyMethodWrapper_Type@Base @SVER@ + _PyMethod_DebugMallocStats@Base @SVER@ + _PyModule_Clear@Base @SVER@ + _PyModule_ClearDict@Base @SVER@ + _PyNamespace_New@Base @SVER@ + _PyNamespace_Type@Base @SVER@ + _PyNode_SizeOf@Base @SVER@ + _PyNone_Type@Base @SVER@ + _PyNotImplemented_Type@Base @SVER@ + _PyOS_GetOpt@Base @SVER@ + _PyOS_IsMainThread@Base @SVER@ + _PyOS_ReadlineTState@Base @SVER@ + _PyOS_ResetGetOpt@Base @SVER@ + _PyOS_URandom@Base @SVER@ + _PyOS_mystrnicmp_hack@Base @SVER@ + _PyOS_optarg@Base @SVER@ + _PyOS_opterr@Base @SVER@ + _PyOS_optind@Base @SVER@ + _PyObjectDict_SetItem@Base @SVER@ + _PyObject_CallFunction_SizeT@Base @SVER@ + _PyObject_CallMethodId@Base @SVER@ + _PyObject_CallMethodIdObjArgs@Base @SVER@ + _PyObject_CallMethodId_SizeT@Base @SVER@ + _PyObject_CallMethod_SizeT@Base @SVER@ + _PyObject_DebugMallocStats@Base @SVER@ + _PyObject_DebugTypeStats@Base @SVER@ + _PyObject_Dump@Base @SVER@ + _PyObject_GC_Malloc@Base @SVER@ + _PyObject_GC_New@Base @SVER@ + _PyObject_GC_NewVar@Base @SVER@ + _PyObject_GC_Resize@Base @SVER@ + _PyObject_GenericGetAttrWithDict@Base @SVER@ + _PyObject_GenericSetAttrWithDict@Base @SVER@ + _PyObject_GetAttrId@Base @SVER@ + _PyObject_GetBuiltin@Base @SVER@ + _PyObject_GetDictPtr@Base @SVER@ + _PyObject_HasAttrId@Base @SVER@ + _PyObject_HasLen@Base @SVER@ + _PyObject_IsAbstract@Base @SVER@ + _PyObject_LookupSpecial@Base @SVER@ + _PyObject_New@Base @SVER@ + _PyObject_NewVar@Base @SVER@ + _PyObject_NextNotImplemented@Base @SVER@ + _PyObject_RealIsInstance@Base @SVER@ + _PyObject_RealIsSubclass@Base @SVER@ + _PyObject_SetAttrId@Base @SVER@ + _PyParser_Grammar@Base @SVER@ + _PyParser_TokenNames@Base @SVER@ + _PyRandom_Fini@Base @SVER@ + _PyRandom_Init@Base @SVER@ + _PySequence_BytesToCharpArray@Base @SVER@ + _PySequence_IterSearch@Base @SVER@ + _PySet_Dummy@Base @SVER@ + _PySet_NextEntry@Base @SVER@ + _PySet_Update@Base @SVER@ + _PySlice_FromIndices@Base @SVER@ + _PySlice_GetLongIndices@Base @SVER@ + _PyState_AddModule@Base @SVER@ + _PyState_ClearModules@Base @SVER@ + _PyStructSequence_Init@Base @SVER@ + _PySys_GetObjectId@Base @SVER@ + _PySys_ImplCacheTag@Base @SVER@ + _PySys_ImplName@Base @SVER@ + _PySys_Init@Base @SVER@ + _PySys_SetObjectId@Base @SVER@ + _PyThreadState_Current@Base @SVER@ + _PyThreadState_DeleteExcept@Base @SVER@ + _PyThreadState_GetFrame@Base @SVER@ + _PyThreadState_Init@Base @SVER@ + _PyThreadState_Prealloc@Base @SVER@ + _PyThread_CurrentFrames@Base @SVER@ + _PyTime_Init@Base @SVER@ + _PyTime_ObjectToTime_t@Base @SVER@ + _PyTime_ObjectToTimespec@Base @SVER@ + _PyTime_ObjectToTimeval@Base @SVER@ + _PyTime_gettimeofday@Base @SVER@ + _PyTime_gettimeofday_info@Base @SVER@ + _PyTraceMalloc_Init@Base @SVER@ + _PyTraceMalloc_Fini@Base @SVER@ + 
_PyTrash_delete_later@Base @SVER@ + _PyTrash_delete_nesting@Base @SVER@ + _PyTrash_deposit_object@Base @SVER@ + _PyTrash_destroy_chain@Base @SVER@ + _PyTrash_thread_deposit_object@Base @SVER@ + _PyTrash_thread_destroy_chain@Base @SVER@ + _PyTuple_DebugMallocStats@Base @SVER@ + _PyTuple_MaybeUntrack@Base @SVER@ + _PyTuple_Resize@Base @SVER@ + _PyType_CalculateMetaclass@Base @SVER@ + _PyType_Fini@Base @SVER@ + _PyType_GetDocFromInternalDoc@Base @SVER@ + _PyType_GetTextSignatureFromInternalDoc@Base @SVER@ + _PyType_Lookup@Base @SVER@ + _PyType_LookupId@Base @SVER@ + _PyUnicodeTranslateError_Create@Base @SVER@ + _PyUnicodeWriter_Dealloc@Base @SVER@ + _PyUnicodeWriter_Finish@Base @SVER@ + _PyUnicodeWriter_Init@Base @SVER@ + _PyUnicodeWriter_PrepareInternal@Base @SVER@ + _PyUnicodeWriter_WriteASCIIString@Base @SVER@ + _PyUnicodeWriter_WriteChar@Base @SVER@ + _PyUnicodeWriter_WriteLatin1String@Base @SVER@ + _PyUnicodeWriter_WriteStr@Base @SVER@ + _PyUnicodeWriter_WriteSubstring@Base @SVER@ + _PyUnicode_AsASCIIString@Base @SVER@ + _PyUnicode_AsKind@Base @SVER@ + _PyUnicode_AsLatin1String@Base @SVER@ + _PyUnicode_AsUTF8String@Base @SVER@ + _PyUnicode_BidirectionalNames@Base @SVER@ + _PyUnicode_CategoryNames@Base @SVER@ + _PyUnicode_ClearStaticStrings@Base @SVER@ + _PyUnicode_CompareWithId@Base @SVER@ + _PyUnicode_Copy@Base @SVER@ + _PyUnicode_Database_Records@Base @SVER@ + _PyUnicode_DecodeUnicodeInternal@Base @SVER@ + _PyUnicode_EastAsianWidthNames@Base @SVER@ + _PyUnicode_EncodeCharmap@Base @SVER@ + _PyUnicode_EncodeUTF16@Base @SVER@ + _PyUnicode_EncodeUTF32@Base @SVER@ + _PyUnicode_EncodeUTF7@Base @SVER@ + _PyUnicode_ExtendedCase@Base @SVER@ + _PyUnicode_FastCopyCharacters@Base @SVER@ + _PyUnicode_FastFill@Base @SVER@ + _PyUnicode_FindMaxChar@Base @SVER@ + _PyUnicode_Fini@Base @SVER@ + _PyUnicode_FormatAdvancedWriter@Base @SVER@ + _PyUnicode_FromASCII@Base @SVER@ + _PyUnicode_FromId@Base @SVER@ + _PyUnicode_Init@Base @SVER@ + _PyUnicode_HasNULChars@Base @SVER@ + _PyUnicode_InsertThousandsGrouping@Base @SVER@ + _PyUnicode_IsAlpha@Base @SVER@ + _PyUnicode_IsCaseIgnorable@Base @SVER@ + _PyUnicode_IsCased@Base @SVER@ + _PyUnicode_IsDecimalDigit@Base @SVER@ + _PyUnicode_IsDigit@Base @SVER@ + _PyUnicode_IsLinebreak@Base @SVER@ + _PyUnicode_IsLowercase@Base @SVER@ + _PyUnicode_IsNumeric@Base @SVER@ + _PyUnicode_IsPrintable@Base @SVER@ + _PyUnicode_IsTitlecase@Base @SVER@ + _PyUnicode_IsUppercase@Base @SVER@ + _PyUnicode_IsWhitespace@Base @SVER@ + _PyUnicode_IsXidContinue@Base @SVER@ + _PyUnicode_IsXidStart@Base @SVER@ + _PyUnicode_Ready@Base @SVER@ + _PyUnicode_ToDecimalDigit@Base @SVER@ + _PyUnicode_ToDigit@Base @SVER@ + _PyUnicode_ToFoldedFull@Base @SVER@ + _PyUnicode_ToLowerFull@Base @SVER@ + _PyUnicode_ToLowercase@Base @SVER@ + _PyUnicode_ToNumeric@Base @SVER@ + _PyUnicode_ToTitleFull@Base @SVER@ + _PyUnicode_ToTitlecase@Base @SVER@ + _PyUnicode_ToUpperFull@Base @SVER@ + _PyUnicode_ToUppercase@Base @SVER@ + _PyUnicode_TransformDecimalAndSpaceToASCII@Base @SVER@ + _PyUnicode_TranslateCharmap@Base @SVER@ + _PyUnicode_TypeRecords@Base @SVER@ + _PyUnicode_XStrip@Base @SVER@ + _PyWarnings_Init@Base @SVER@ + _PyWeakref_CallableProxyType@Base @SVER@ + _PyWeakref_ClearRef@Base @SVER@ + _PyWeakref_GetWeakrefCount@Base @SVER@ + _PyWeakref_ProxyType@Base @SVER@ + _PyWeakref_RefType@Base @SVER@ + _Py_Assert@Base @SVER@ + _Py_Assign@Base @SVER@ + _Py_Attribute@Base @SVER@ + _Py_AugAssign@Base @SVER@ + _Py_BinOp@Base @SVER@ + _Py_BoolOp@Base @SVER@ + _Py_Break@Base @SVER@ + _Py_BreakPoint@Base @SVER@ + 
_Py_BuildValue_SizeT@Base @SVER@ + _Py_Bytes@Base @SVER@ + _Py_Call@Base @SVER@ + _Py_CheckRecursionLimit@Base @SVER@ + _Py_CheckRecursiveCall@Base @SVER@ + _Py_ClassDef@Base @SVER@ + _Py_Compare@Base @SVER@ + _Py_Continue@Base @SVER@ + _Py_Dealloc@Base @SVER@ + _Py_Delete@Base @SVER@ + _Py_Dict@Base @SVER@ + _Py_DictComp@Base @SVER@ + _Py_DisplaySourceLine@Base @SVER@ + _Py_DumpTraceback@Base @SVER@ + _Py_DumpTracebackThreads@Base @SVER@ + _Py_Ellipsis@Base @SVER@ + _Py_EllipsisObject@Base @SVER@ + _Py_ExceptHandler@Base @SVER@ + _Py_Expr@Base @SVER@ + _Py_Expression@Base @SVER@ + _Py_ExtSlice@Base @SVER@ + _Py_FalseStruct@Base @SVER@ + _Py_Finalizing@Base @SVER@ + _Py_For@Base @SVER@ + _Py_FreeCharPArray@Base @SVER@ + _Py_FunctionDef@Base @SVER@ + _Py_GeneratorExp@Base @SVER@ + _Py_GetAllocatedBlocks@Base @SVER@ + _Py_Global@Base @SVER@ + _Py_Gid_Converter@Base @SVER@ + _Py_HashBytes@Base @SVER@ + _Py_HashDouble@Base @SVER@ + _Py_HashPointer@Base @SVER@ + _Py_HashSecret@Base @SVER@ + _Py_If@Base @SVER@ + _Py_IfExp@Base @SVER@ + _Py_Import@Base @SVER@ + _Py_ImportFrom@Base @SVER@ + _Py_Index@Base @SVER@ + _Py_InitializeEx_Private@Base @SVER@ + _Py_Interactive@Base @SVER@ + _Py_Lambda@Base @SVER@ + _Py_List@Base @SVER@ + _Py_ListComp@Base @SVER@ + _Py_M__importlib@Base @SVER@ + _Py_Mangle@Base @SVER@ + _Py_Module@Base @SVER@ + _Py_Name@Base @SVER@ + _Py_NameConstant@Base @SVER@ + _Py_NoneStruct@Base @SVER@ + _Py_Nonlocal@Base @SVER@ + _Py_NotImplementedStruct@Base @SVER@ + _Py_Num@Base @SVER@ + _Py_PackageContext@Base @SVER@ + _Py_Pass@Base @SVER@ + _Py_PyAtExit@Base @SVER@ + _Py_Raise@Base @SVER@ + _Py_ReadyTypes@Base @SVER@ + _Py_ReleaseInternedUnicodeStrings@Base @SVER@ + _Py_RestoreSignals@Base @SVER@ + _Py_Return@Base @SVER@ + _Py_Set@Base @SVER@ + _Py_SetComp@Base @SVER@ + _Py_Slice@Base @SVER@ + _Py_Starred@Base @SVER@ + _Py_Str@Base @SVER@ + _Py_Subscript@Base @SVER@ + _Py_Suite@Base @SVER@ + _Py_SwappedOp@Base @SVER@ + _Py_TrueStruct@Base @SVER@ + _Py_Try@Base @SVER@ + _Py_Tuple@Base @SVER@ + _Py_Uid_Converter@Base @SVER@ + _Py_UnaryOp@Base @SVER@ + _Py_VaBuildValue_SizeT@Base @SVER@ + _Py_While@Base @SVER@ + _Py_With@Base @SVER@ + _Py_Yield@Base @SVER@ + _Py_YieldFrom@Base @SVER@ + _Py_abstract_hack@Base @SVER@ + _Py_acosh@Base @SVER@ + _Py_add_one_to_index_C@Base @SVER@ + _Py_add_one_to_index_F@Base @SVER@ + _Py_addarc@Base @SVER@ + _Py_addbit@Base @SVER@ + _Py_adddfa@Base @SVER@ + _Py_addfirstsets@Base @SVER@ + _Py_addlabel@Base @SVER@ + _Py_addstate@Base @SVER@ + _Py_alias@Base @SVER@ + _Py_arg@Base @SVER@ + _Py_arguments@Base @SVER@ + _Py_ascii_whitespace@Base @SVER@ + _Py_asdl_int_seq_new@Base @SVER@ + _Py_asdl_seq_new@Base @SVER@ + _Py_asinh@Base @SVER@ + _Py_atanh@Base @SVER@ + _Py_bytes_capitalize@Base @SVER@ + _Py_bytes_isalnum@Base @SVER@ + _Py_bytes_isalpha@Base @SVER@ + _Py_bytes_isdigit@Base @SVER@ + _Py_bytes_islower@Base @SVER@ + _Py_bytes_isspace@Base @SVER@ + _Py_bytes_istitle@Base @SVER@ + _Py_bytes_isupper@Base @SVER@ + _Py_bytes_lower@Base @SVER@ + _Py_bytes_maketrans@Base @SVER@ + _Py_bytes_swapcase@Base @SVER@ + _Py_bytes_title@Base @SVER@ + _Py_bytes_upper@Base @SVER@ + _Py_c_abs@Base @SVER@ + _Py_c_diff@Base @SVER@ + _Py_c_neg@Base @SVER@ + _Py_c_pow@Base @SVER@ + _Py_c_prod@Base @SVER@ + _Py_c_quot@Base @SVER@ + _Py_c_sum@Base @SVER@ + _Py_capitalize__doc__@Base @SVER@ + _Py_char2wchar@Base @SVER@ + _Py_comprehension@Base @SVER@ + _Py_ctype_table@Base @SVER@ + _Py_ctype_tolower@Base @SVER@ + _Py_ctype_toupper@Base @SVER@ + _Py_delbitset@Base @SVER@ + 
_Py_device_encoding@Base @SVER@ + (arch=!m68k)_Py_dg_dtoa@Base @SVER@ + (arch=!m68k)_Py_dg_freedtoa@Base @SVER@ + (arch=!m68k)_Py_dg_infinity@Base @SVER@ + (arch=!m68k)_Py_dg_stdnan@Base @SVER@ + (arch=!m68k)_Py_dg_strtod@Base @SVER@ + _Py_dup@Base @SVER@ + _Py_expm1@Base @SVER@ + _Py_findlabel@Base @SVER@ + _Py_fopen@Base @SVER@ + _Py_fopen_obj@Base @SVER@ + (arch=i386 lpia m68k)_Py_force_double@Base @SVER@ + (arch=amd64 i386 lpia)_Py_get_387controlword@Base @SVER@ + _Py_get_inheritable@Base @SVER@ + _Py_hashtable_clear@Base @SVER@ + _Py_hashtable_compare_direct@Base @SVER@ + _Py_hashtable_copy@Base @SVER@ + _Py_hashtable_delete@Base @SVER@ + _Py_hashtable_destroy@Base @SVER@ + _Py_hashtable_foreach@Base @SVER@ + _Py_hashtable_get@Base @SVER@ + _Py_hashtable_get_entry@Base @SVER@ + _Py_hashtable_hash_int@Base @SVER@ + _Py_hashtable_hash_ptr@Base @SVER@ + _Py_hashtable_new@Base @SVER@ + _Py_hashtable_new_full@Base @SVER@ + _Py_hashtable_pop@Base @SVER@ + _Py_hashtable_set@Base @SVER@ + _Py_hashtable_size@Base @SVER@ + _Py_hgidentifier@Base @SVER@ + _Py_hgversion@Base @SVER@ + _Py_isalnum__doc__@Base @SVER@ + _Py_isalpha__doc__@Base @SVER@ + _Py_isdigit__doc__@Base @SVER@ + _Py_islower__doc__@Base @SVER@ + _Py_isspace__doc__@Base @SVER@ + _Py_istitle__doc__@Base @SVER@ + _Py_isupper__doc__@Base @SVER@ + _Py_keyword@Base @SVER@ + _Py_log1p@Base @SVER@ + _Py_lower__doc__@Base @SVER@ + _Py_maketrans__doc__@Base @SVER@ + _Py_mergebitset@Base @SVER@ + _Py_meta_grammar@Base @SVER@ + _Py_newbitset@Base @SVER@ + _Py_newgrammar@Base @SVER@ + _Py_normalize_encoding@Base @SVER@ + _Py_open@Base @SVER@ + _Py_open_cloexec_works@Base @SVER@ + _Py_parse_inf_or_nan@Base @SVER@ + _Py_pgen@Base @SVER@ + _Py_samebitset@Base @SVER@ + (arch=amd64 i386 lpia)_Py_set_387controlword@Base @SVER@ + _Py_set_inheritable@Base @SVER@ + _Py_stat@Base @SVER@ + _Py_swapcase__doc__@Base @SVER@ + _Py_title__doc__@Base @SVER@ + _Py_translatelabels@Base @SVER@ + _Py_upper__doc__@Base @SVER@ + _Py_wchar2char@Base @SVER@ + _Py_wfopen@Base @SVER@ + _Py_wgetcwd@Base @SVER@ + _Py_withitem@Base @SVER@ + _Py_wreadlink@Base @SVER@ + _Py_wrealpath@Base @SVER@ + _Py_wstat@Base @SVER@ + + (optional|regex)"^_ctypes_.*@Base$" @SVER@ + (optional|regex)"^ffi_type_.*@Base$" @SVER@ + (optional|regex)"^ffi_closure_.*@Base$" @SVER@ + + (optional|regex)"^PyInit_.*@Base$" @SVER@ --- python3.4-3.4.1.orig/debian/locale-gen +++ python3.4-3.4.1/debian/locale-gen @@ -0,0 +1,31 @@ +#!/bin/sh + +LOCPATH=`pwd`/locales +export LOCPATH + +[ -d $LOCPATH ] || mkdir -p $LOCPATH + +umask 022 + +echo "Generating locales..." +while read locale charset; do + case $locale in \#*) continue;; esac + [ -n "$locale" -a -n "$charset" ] || continue + echo -n " `echo $locale | sed 's/\([^.\@]*\).*/\1/'`" + echo -n ".$charset" + echo -n `echo $locale | sed 's/\([^\@]*\)\(\@.*\)*/\2/'` + echo -n '...' 
+ if [ -f $LOCPATH/$locale ]; then + input=$locale + else + input=`echo $locale | sed 's/\([^.]*\)[^@]*\(.*\)/\1\2/'` + fi + localedef -i $input -c -f $charset $LOCPATH/$locale #-A /etc/locale.alias + echo ' done'; \ +done < +# elif defined(__x86_64__) && defined(__ILP32__) +# include +# elif defined(__i386__) +# include +# elif defined(__aarch64__) && defined(__AARCH64EL__) +# include +# elif defined(__alpha__) +# include +# elif defined(__ARM_EABI__) && defined(__ARM_PCS_VFP) +# include +# elif defined(__ARM_EABI__) && !defined(__ARM_PCS_VFP) +# include +# elif defined(__hppa__) +# include +# elif defined(__ia64__) +# include +# elif defined(__m68k__) && !defined(__mcoldfire__) +# include +# elif defined(__mips_hard_float) && defined(_MIPSEL) +# if _MIPS_SIM == _ABIO32 +# include +# elif _MIPS_SIM == _ABIN32 +# include +# elif _MIPS_SIM == _ABI64 +# include +# else +# error unknown multiarch location for @header@ +# endif +# elif defined(__mips_hard_float) +# if _MIPS_SIM == _ABIO32 +# include +# elif _MIPS_SIM == _ABIN32 +# include +# elif _MIPS_SIM == _ABI64 +# include +# else +# error unknown multiarch location for @header@ +# endif +# elif defined(__or1k__) +# include +# elif defined(__powerpc__) && defined(__SPE__) +# include +# elif defined(__powerpc64__) +# if defined(__LITTLE_ENDIAN__) +# include +# else +# include +# endif +# elif defined(__powerpc__) +# include +# elif defined(__s390x__) +# include +# elif defined(__s390__) +# include +# elif defined(__sh__) && defined(__LITTLE_ENDIAN__) +# include +# elif defined(__sparc__) && defined(__arch64__) +# include +# elif defined(__sparc__) +# include +# else +# error unknown multiarch location for @header@ +# endif +#elif defined(__FreeBSD_kernel__) +# if defined(__LP64__) +# include +# elif defined(__i386__) +# include +# else +# error unknown multiarch location for @header@ +# endif +#elif defined(__gnu_hurd__) +# include +#else +# error unknown multiarch location for @header@ +#endif --- python3.4-3.4.1.orig/debian/patches/bdist-wininst-notfound.diff +++ python3.4-3.4.1/debian/patches/bdist-wininst-notfound.diff @@ -0,0 +1,17 @@ +# DP: suggest installation of the pythonX.Y-dev package, if bdist_wininst +# DP: cannot find the wininst-* files. + +--- a/Lib/distutils/command/bdist_wininst.py ++++ b/Lib/distutils/command/bdist_wininst.py +@@ -342,7 +342,10 @@ + sfix = '' + + filename = os.path.join(directory, "wininst-%.1f%s.exe" % (bv, sfix)) +- f = open(filename, "rb") ++ try: ++ f = open(filename, "rb") ++ except IOError as e: ++ raise DistutilsFileError(str(e) + ', %s not included in the Debian packages.' % filename) + try: + return f.read() + finally: --- python3.4-3.4.1.orig/debian/patches/ctypes-arm.diff +++ python3.4-3.4.1/debian/patches/ctypes-arm.diff @@ -0,0 +1,34 @@ +Index: b/Lib/ctypes/util.py +=================================================================== +--- a/Lib/ctypes/util.py ++++ b/Lib/ctypes/util.py +@@ -201,16 +201,27 @@ elif os.name == "posix": + + def _findSoname_ldconfig(name): + import struct ++ # XXX this code assumes that we know all unames and that a single ++ # ABI is supported per uname; instead we should find what the ++ # ABI is (e.g. check ABI of current process) or simply ask libc ++ # to load the library for us ++ uname = os.uname() ++ # ARM has a variety of unames, e.g. 
armv7l ++ if uname.machine.startswith("arm"): ++ machine = "arm" + if struct.calcsize('l') == 4: +- machine = os.uname().machine + '-32' ++ machine = uname.machine + '-32' + else: +- machine = os.uname().machine + '-64' ++ machine = uname.machine + '-64' + mach_map = { + 'x86_64-64': 'libc6,x86-64', + 'ppc64-64': 'libc6,64bit', + 'sparc64-64': 'libc6,64bit', + 's390x-64': 'libc6,64bit', + 'ia64-64': 'libc6,IA-64', ++ # this actually breaks on biarch or multiarch as the first ++ # library wins; uname doesn't tell us which ABI we're using ++ 'arm-32': 'libc6(,hard-float)?', + } + abi_type = mach_map.get(machine, 'libc6') + --- python3.4-3.4.1.orig/debian/patches/deb-locations.diff +++ python3.4-3.4.1/debian/patches/deb-locations.diff @@ -0,0 +1,30 @@ +# DP: adjust locations of directories to debian policy + +Index: b/Lib/pydoc.py +=================================================================== +--- a/Lib/pydoc.py ++++ b/Lib/pydoc.py +@@ -28,6 +28,10 @@ to a file named ".html". + + Module docs for core modules are assumed to be in + ++ /usr/share/doc/pythonX.Y/html/library ++ ++if the pythonX.Y-doc package is installed or in ++ + http://docs.python.org/X.Y/library/ + + This can be overridden by setting the PYTHONDOCS environment variable +Index: b/Misc/python.man +=================================================================== +--- a/Misc/python.man ++++ b/Misc/python.man +@@ -330,7 +330,7 @@ exception). Error messages are written + These are subject to difference depending on local installation + conventions; ${prefix} and ${exec_prefix} are installation-dependent + and should be interpreted as for GNU software; they may be the same. +-The default for both is \fI/usr/local\fP. ++On Debian GNU/{Hurd,Linux} the default for both is \fI/usr\fP. + .IP \fI${exec_prefix}/bin/python\fP + Recommended location of the interpreter. + .PP --- python3.4-3.4.1.orig/debian/patches/deb-setup.diff +++ python3.4-3.4.1/debian/patches/deb-setup.diff @@ -0,0 +1,33 @@ +# DP: Don't include /usr/local/include and /usr/local/lib as gcc search paths + +Index: b/setup.py +=================================================================== +--- a/setup.py ++++ b/setup.py +@@ -240,8 +240,10 @@ + # unfortunately, distutils doesn't let us provide separate C and C++ + # compilers + if compiler is not None: +- (ccshared,cflags) = sysconfig.get_config_vars('CCSHARED','CFLAGS') +- args['compiler_so'] = compiler + ' ' + ccshared + ' ' + cflags ++ (ccshared, cppflags, cflags) = \ ++ sysconfig.get_config_vars('CCSHARED', 'CPPFLAGS', 'CFLAGS') ++ cppflags = ' '.join([f for f in cppflags.split() if not f.startswith('-I')]) ++ args['compiler_so'] = compiler + ' ' + ccshared + ' ' + cppflags + ' ' + cflags + self.compiler.set_executables(**args) + + build_ext.build_extensions(self) +@@ -441,12 +443,7 @@ + os.unlink(tmpfile) + + def detect_modules(self): +- # Ensure that /usr/local is always used, but the local build +- # directories (i.e. '.' and 'Include') must be first. See issue +- # 10520. 
+- if not cross_compiling: +- add_dir_to_list(self.compiler.library_dirs, '/usr/local/lib') +- add_dir_to_list(self.compiler.include_dirs, '/usr/local/include') ++ # On Debian /usr/local is always used, so we don't include it twice + # only change this for cross builds for 3.3, issues on Mageia + if cross_compiling: + self.add_gcc_paths() --- python3.4-3.4.1.orig/debian/patches/disable-sem-check.diff +++ python3.4-3.4.1/debian/patches/disable-sem-check.diff @@ -0,0 +1,38 @@ +# DP: Assume working semaphores, don't rely on running kernel for the check. + +Index: b/configure.ac +=================================================================== +--- a/configure.ac ++++ b/configure.ac +@@ -3930,8 +3930,13 @@ int main(void) { + AC_MSG_RESULT($ac_cv_posix_semaphores_enabled) + if test $ac_cv_posix_semaphores_enabled = no + then +- AC_DEFINE(POSIX_SEMAPHORES_NOT_ENABLED, 1, +- [Define if POSIX semaphores aren't enabled on your system]) ++ case $ac_sys_system in ++ Linux*) # assume yes, see https://launchpad.net/bugs/630511 ++ ;; ++ *) ++ AC_DEFINE(POSIX_SEMAPHORES_NOT_ENABLED, 1, ++ [Define if POSIX semaphores aren't enabled on your system]) ++ esac + fi + + # Multiprocessing check for broken sem_getvalue +@@ -3966,8 +3971,13 @@ int main(void){ + AC_MSG_RESULT($ac_cv_broken_sem_getvalue) + if test $ac_cv_broken_sem_getvalue = yes + then +- AC_DEFINE(HAVE_BROKEN_SEM_GETVALUE, 1, +- [define to 1 if your sem_getvalue is broken.]) ++ case $ac_sys_system in ++ Linux*) # assume yes, see https://launchpad.net/bugs/630511 ++ ;; ++ *) ++ AC_DEFINE(HAVE_BROKEN_SEM_GETVALUE, 1, ++ [define to 1 if your sem_getvalue is broken.]) ++ esac + fi + + # determine what size digit to use for Python's longs --- python3.4-3.4.1.orig/debian/patches/disable-some-tests.diff +++ python3.4-3.4.1/debian/patches/disable-some-tests.diff @@ -0,0 +1,14 @@ +# DP: Disable some failing tests we are not interested in + +Index: b/Lib/distutils/tests/test_build_ext.py +=================================================================== +--- a/Lib/distutils/tests/test_build_ext.py ++++ b/Lib/distutils/tests/test_build_ext.py +@@ -91,6 +91,7 @@ + build_ext.USER_BASE = self.old_user_base + super(BuildExtTestCase, self).tearDown() + ++ @unittest.skip('Skipping failing Solaris test') + def test_solaris_enable_shared(self): + dist = Distribution({'name': 'xx'}) + cmd = build_ext(dist) --- python3.4-3.4.1.orig/debian/patches/distutils-init.diff +++ python3.4-3.4.1/debian/patches/distutils-init.diff @@ -0,0 +1,60 @@ +# DP: Use _sysconfigdata.py in distutils to initialize distutils + +Index: b/Lib/distutils/sysconfig.py +=================================================================== +--- a/Lib/distutils/sysconfig.py ++++ b/Lib/distutils/sysconfig.py +@@ -453,49 +453,11 @@ _config_vars = None + + def _init_posix(): + """Initialize the module as appropriate for POSIX systems.""" +- g = {} +- # load the installed Makefile: +- try: +- filename = get_makefile_filename() +- parse_makefile(filename, g) +- except OSError as msg: +- my_msg = "invalid Python installation: unable to open %s" % filename +- if hasattr(msg, "strerror"): +- my_msg = my_msg + " (%s)" % msg.strerror +- +- raise DistutilsPlatformError(my_msg) +- +- # load the installed pyconfig.h: +- try: +- filename = get_config_h_filename() +- with open(filename) as file: +- parse_config_h(file, g) +- except OSError as msg: +- my_msg = "invalid Python installation: unable to open %s" % filename +- if hasattr(msg, "strerror"): +- my_msg = my_msg + " (%s)" % msg.strerror +- +- raise 
DistutilsPlatformError(my_msg) +- +- # On AIX, there are wrong paths to the linker scripts in the Makefile +- # -- these paths are relative to the Python source, but when installed +- # the scripts are in another directory. +- if python_build: +- g['LDSHARED'] = g['BLDSHARED'] +- +- elif get_python_version() < '2.1': +- # The following two branches are for 1.5.2 compatibility. +- if sys.platform == 'aix4': # what about AIX 3.x ? +- # Linker script is in the config directory, not in Modules as the +- # Makefile says. +- python_lib = get_python_lib(standard_lib=1) +- ld_so_aix = os.path.join(python_lib, 'config', 'ld_so_aix') +- python_exp = os.path.join(python_lib, 'config', 'python.exp') +- +- g['LDSHARED'] = "%s %s -bI:%s" % (ld_so_aix, g['CC'], python_exp) +- ++ # _sysconfigdata is generated at build time, see the sysconfig module ++ from _sysconfigdata import build_time_vars + global _config_vars +- _config_vars = g ++ _config_vars = {} ++ _config_vars.update(build_time_vars) + + + def _init_nt(): --- python3.4-3.4.1.orig/debian/patches/distutils-install-layout.diff +++ python3.4-3.4.1/debian/patches/distutils-install-layout.diff @@ -0,0 +1,260 @@ +# DP: distutils: Add an option --install-layout=deb, which +# DP: - installs into $prefix/dist-packages instead of $prefix/site-packages. +# DP: - doesn't encode the python version into the egg name. + +Index: b/Lib/distutils/command/install_egg_info.py +=================================================================== +--- a/Lib/distutils/command/install_egg_info.py ++++ b/Lib/distutils/command/install_egg_info.py +@@ -14,18 +14,38 @@ class install_egg_info(Command): + description = "Install package's PKG-INFO metadata as an .egg-info file" + user_options = [ + ('install-dir=', 'd', "directory to install to"), ++ ('install-layout', None, "custom installation layout"), + ] + + def initialize_options(self): + self.install_dir = None ++ self.install_layout = None ++ self.prefix_option = None + + def finalize_options(self): + self.set_undefined_options('install_lib',('install_dir','install_dir')) +- basename = "%s-%s-py%s.egg-info" % ( +- to_filename(safe_name(self.distribution.get_name())), +- to_filename(safe_version(self.distribution.get_version())), +- sys.version[:3] +- ) ++ self.set_undefined_options('install',('install_layout','install_layout')) ++ self.set_undefined_options('install',('prefix_option','prefix_option')) ++ if self.install_layout: ++ if not self.install_layout.lower() in ['deb', 'unix']: ++ raise DistutilsOptionError( ++ "unknown value for --install-layout") ++ no_pyver = (self.install_layout.lower() == 'deb') ++ elif self.prefix_option: ++ no_pyver = False ++ else: ++ no_pyver = True ++ if no_pyver: ++ basename = "%s-%s.egg-info" % ( ++ to_filename(safe_name(self.distribution.get_name())), ++ to_filename(safe_version(self.distribution.get_version())) ++ ) ++ else: ++ basename = "%s-%s-py%s.egg-info" % ( ++ to_filename(safe_name(self.distribution.get_name())), ++ to_filename(safe_version(self.distribution.get_version())), ++ sys.version[:3] ++ ) + self.target = os.path.join(self.install_dir, basename) + self.outputs = [self.target] + +Index: b/Lib/distutils/command/install.py +=================================================================== +--- a/Lib/distutils/command/install.py ++++ b/Lib/distutils/command/install.py +@@ -50,6 +50,20 @@ INSTALL_SCHEMES = { + 'scripts': '$base/bin', + 'data' : '$base', + }, ++ 'unix_local': { ++ 'purelib': '$base/local/lib/python$py_version_short/dist-packages', ++ 'platlib': 
'$platbase/local/lib/python$py_version_short/dist-packages', ++ 'headers': '$base/local/include/python$py_version_short/$dist_name', ++ 'scripts': '$base/local/bin', ++ 'data' : '$base/local', ++ }, ++ 'deb_system': { ++ 'purelib': '$base/lib/python3/dist-packages', ++ 'platlib': '$platbase/lib/python3/dist-packages', ++ 'headers': '$base/include/python$py_version_short/$dist_name', ++ 'scripts': '$base/bin', ++ 'data' : '$base', ++ }, + 'unix_home': { + 'purelib': '$base/lib/python', + 'platlib': '$base/lib/python', +@@ -146,6 +160,9 @@ class install(Command): + + ('record=', None, + "filename in which to record list of installed files"), ++ ++ ('install-layout=', None, ++ "installation layout to choose (known values: deb, unix)"), + ] + + boolean_options = ['compile', 'force', 'skip-build'] +@@ -166,6 +183,7 @@ class install(Command): + self.exec_prefix = None + self.home = None + self.user = 0 ++ self.prefix_option = None + + # These select only the installation base; it's up to the user to + # specify the installation scheme (currently, that means supplying +@@ -187,6 +205,9 @@ class install(Command): + self.install_userbase = USER_BASE + self.install_usersite = USER_SITE + ++ # enable custom installation, known values: deb ++ self.install_layout = None ++ + self.compile = None + self.optimize = None + +@@ -426,6 +447,7 @@ class install(Command): + self.install_base = self.install_platbase = self.home + self.select_scheme("unix_home") + else: ++ self.prefix_option = self.prefix + if self.prefix is None: + if self.exec_prefix is not None: + raise DistutilsOptionError( +@@ -440,7 +462,26 @@ class install(Command): + + self.install_base = self.prefix + self.install_platbase = self.exec_prefix +- self.select_scheme("unix_prefix") ++ if self.install_layout: ++ if self.install_layout.lower() in ['deb']: ++ self.select_scheme("deb_system") ++ elif self.install_layout.lower() in ['unix']: ++ self.select_scheme("unix_prefix") ++ else: ++ raise DistutilsOptionError( ++ "unknown value for --install-layout") ++ elif ((self.prefix_option and ++ os.path.normpath(self.prefix) != '/usr/local') ++ or sys.base_prefix != sys.prefix ++ or 'PYTHONUSERBASE' in os.environ ++ or 'VIRTUAL_ENV' in os.environ ++ or 'real_prefix' in sys.__dict__): ++ self.select_scheme("unix_prefix") ++ else: ++ if os.path.normpath(self.prefix) == '/usr/local': ++ self.prefix = self.exec_prefix = '/usr' ++ self.install_base = self.install_platbase = '/usr' ++ self.select_scheme("unix_local") + + def finalize_other(self): + """Finalizes options for non-posix platforms""" +Index: b/Lib/distutils/sysconfig.py +=================================================================== +--- a/Lib/distutils/sysconfig.py ++++ b/Lib/distutils/sysconfig.py +@@ -134,6 +134,7 @@ def get_python_lib(plat_specific=0, stan + If 'prefix' is supplied, use it instead of sys.base_prefix or + sys.base_exec_prefix -- i.e., ignore 'plat_specific'. 
+ """ ++ is_default_prefix = not prefix or os.path.normpath(prefix) in ('/usr', '/usr/local') + if prefix is None: + if standard_lib: + prefix = plat_specific and BASE_EXEC_PREFIX or BASE_PREFIX +@@ -145,6 +146,12 @@ def get_python_lib(plat_specific=0, stan + "lib", "python" + get_python_version()) + if standard_lib: + return libpython ++ elif (is_default_prefix and ++ 'PYTHONUSERBASE' not in os.environ and ++ 'VIRTUAL_ENV' not in os.environ and ++ 'real_prefix' not in sys.__dict__ and ++ sys.prefix == sys.base_prefix): ++ return os.path.join(prefix, "lib", "python3", "dist-packages") + else: + return os.path.join(libpython, "site-packages") + elif os.name == "nt": +Index: b/Lib/site.py +=================================================================== +--- a/Lib/site.py ++++ b/Lib/site.py +@@ -288,6 +288,13 @@ def addusersitepackages(known_paths): + + if ENABLE_USER_SITE and os.path.isdir(user_site): + addsitedir(user_site, known_paths) ++ if ENABLE_USER_SITE: ++ for dist_libdir in ("lib", "local/lib"): ++ user_site = os.path.join(USER_BASE, dist_libdir, ++ "python" + sys.version[:3], ++ "dist-packages") ++ if os.path.isdir(user_site): ++ addsitedir(user_site, known_paths) + return known_paths + + def getsitepackages(prefixes=None): +Index: b/Lib/test/test_site.py +=================================================================== +--- a/Lib/test/test_site.py ++++ b/Lib/test/test_site.py +@@ -243,12 +243,20 @@ class HelperFunctionsTests(unittest.Test + self.assertEqual(dirs[2], wanted) + elif os.sep == '/': + # OS X non-framwework builds, Linux, FreeBSD, etc +- self.assertEqual(len(dirs), 2) +- wanted = os.path.join('xoxo', 'lib', 'python' + sys.version[:3], +- 'site-packages') ++ self.assertEqual(len(dirs), 4) ++ wanted = os.path.join('xoxo', 'local', 'lib', ++ 'python' + sys.version[:3], ++ 'dist-packages') + self.assertEqual(dirs[0], wanted) +- wanted = os.path.join('xoxo', 'lib', 'site-python') ++ wanted = os.path.join('xoxo', 'lib', ++ 'python3', 'dist-packages') + self.assertEqual(dirs[1], wanted) ++ wanted = os.path.join('xoxo', 'lib', ++ 'python' + sys.version[:3], ++ 'dist-packages') ++ self.assertEqual(dirs[2], wanted) ++ wanted = os.path.join('xoxo', 'lib', 'dist-python') ++ self.assertEqual(dirs[3], wanted) + else: + # other platforms + self.assertEqual(len(dirs), 2) +Index: b/Lib/distutils/tests/test_bdist_dumb.py +=================================================================== +--- a/Lib/distutils/tests/test_bdist_dumb.py ++++ b/Lib/distutils/tests/test_bdist_dumb.py +@@ -85,7 +85,7 @@ class BuildDumbTestCase(support.TempdirM + fp.close() + + contents = sorted(os.path.basename(fn) for fn in contents) +- wanted = ['foo-0.1-py%s.%s.egg-info' % sys.version_info[:2], 'foo.py'] ++ wanted = ['foo-0.1.egg-info', 'foo.py'] + if not sys.dont_write_bytecode: + wanted.append('foo.%s.pyc' % sys.implementation.cache_tag) + self.assertEqual(contents, sorted(wanted)) +Index: b/Lib/distutils/tests/test_install.py +=================================================================== +--- a/Lib/distutils/tests/test_install.py ++++ b/Lib/distutils/tests/test_install.py +@@ -194,7 +194,7 @@ class InstallTestCase(support.TempdirMan + found = [os.path.basename(line) for line in content.splitlines()] + expected = ['hello.py', 'hello.%s.pyc' % sys.implementation.cache_tag, + 'sayhi', +- 'UNKNOWN-0.0.0-py%s.%s.egg-info' % sys.version_info[:2]] ++ 'UNKNOWN-0.0.0.egg-info'] + self.assertEqual(found, expected) + + def test_record_extensions(self): +@@ -224,7 +224,7 @@ class 
InstallTestCase(support.TempdirMan + + found = [os.path.basename(line) for line in content.splitlines()] + expected = [_make_ext_name('xx'), +- 'UNKNOWN-0.0.0-py%s.%s.egg-info' % sys.version_info[:2]] ++ 'UNKNOWN-0.0.0.egg-info'] + self.assertEqual(found, expected) + + def test_debug_mode(self): +Index: b/Lib/pydoc.py +=================================================================== +--- a/Lib/pydoc.py ++++ b/Lib/pydoc.py +@@ -404,6 +404,7 @@ class Doc: + 'marshal', 'posix', 'signal', 'sys', + '_thread', 'zipimport') or + (file.startswith(basedir) and ++ not file.startswith(os.path.join(basedir, 'dist-packages')) and + not file.startswith(os.path.join(basedir, 'site-packages')))) and + object.__name__ not in ('xml.etree', 'test.pydoc_mod')): + if docloc.startswith("http://"): --- python3.4-3.4.1.orig/debian/patches/distutils-link.diff +++ python3.4-3.4.1/debian/patches/distutils-link.diff @@ -0,0 +1,24 @@ +# DP: Don't add standard library dirs to library_dirs and runtime_library_dirs. + +Index: b/Lib/distutils/unixccompiler.py +=================================================================== +--- a/Lib/distutils/unixccompiler.py ++++ b/Lib/distutils/unixccompiler.py +@@ -153,6 +153,17 @@ + runtime_library_dirs) + libraries, library_dirs, runtime_library_dirs = fixed_args + ++ # filter out standard library paths, which are not explicitely needed ++ # for linking ++ system_libdirs = ['/lib', '/lib64', '/usr/lib', '/usr/lib64'] ++ multiarch = sysconfig.get_config_var("MULTIARCH") ++ if multiarch: ++ system_libdirs.extend(['/lib/%s' % multiarch, '/usr/lib/%s' % multiarch]) ++ library_dirs = [dir for dir in library_dirs ++ if not dir in system_libdirs] ++ runtime_library_dirs = [dir for dir in runtime_library_dirs ++ if not dir in system_libdirs] ++ + lib_opts = gen_lib_options(self, library_dirs, runtime_library_dirs, + libraries) + if not isinstance(output_dir, (str, type(None))): --- python3.4-3.4.1.orig/debian/patches/distutils-sysconfig.diff +++ python3.4-3.4.1/debian/patches/distutils-sysconfig.diff @@ -0,0 +1,45 @@ +# DP: Get CONFIGURE_CFLAGS, CONFIGURE_CPPFLAGS, CONFIGURE_LDFLAGS from +# DP: the python build, when CFLAGS, CPPFLAGS, LDSHARED) are not set +# DP: in the environment. 
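# Illustrative sketch (not part of this patch): with the change below, distutils
# falls back to the flags recorded when the interpreter itself was configured,
# so a build whose environment does not set CFLAGS/CPPFLAGS/LDFLAGS still
# compiles extensions with them. Roughly:
#
#     import os
#     from distutils import sysconfig
#     from distutils.ccompiler import new_compiler
#
#     cc = new_compiler()
#     sysconfig.customize_compiler(cc)
#     # if CFLAGS is absent from os.environ, the patched code folds the
#     # CONFIGURE_CFLAGS config variable into the compile command instead
#     print(cc.compiler_so)
#
# The exact flag values shown depend on how the interpreter was configured.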
+ +Index: b/Lib/distutils/sysconfig.py +=================================================================== +--- a/Lib/distutils/sysconfig.py ++++ b/Lib/distutils/sysconfig.py +@@ -191,9 +191,11 @@ + _osx_support.customize_compiler(_config_vars) + _config_vars['CUSTOMIZED_OSX_COMPILER'] = 'True' + +- (cc, cxx, opt, cflags, ccshared, ldshared, shlib_suffix, ar, ar_flags) = \ ++ (cc, cxx, opt, cflags, ccshared, ldshared, shlib_suffix, ar, ar_flags, ++ configure_cppflags, configure_cflags, configure_ldflags) = \ + get_config_vars('CC', 'CXX', 'OPT', 'CFLAGS', +- 'CCSHARED', 'LDSHARED', 'SHLIB_SUFFIX', 'AR', 'ARFLAGS') ++ 'CCSHARED', 'LDSHARED', 'SHLIB_SUFFIX', 'AR', 'ARFLAGS', ++ 'CONFIGURE_CPPFLAGS', 'CONFIGURE_CFLAGS', 'CONFIGURE_LDFLAGS') + + if 'CC' in os.environ: + newcc = os.environ['CC'] +@@ -214,13 +216,22 @@ + cpp = cc + " -E" # not always + if 'LDFLAGS' in os.environ: + ldshared = ldshared + ' ' + os.environ['LDFLAGS'] ++ elif configure_ldflags: ++ ldshared = ldshared + ' ' + configure_ldflags + if 'CFLAGS' in os.environ: + cflags = opt + ' ' + os.environ['CFLAGS'] + ldshared = ldshared + ' ' + os.environ['CFLAGS'] ++ elif configure_cflags: ++ cflags = opt + ' ' + configure_cflags ++ ldshared = ldshared + ' ' + configure_cflags + if 'CPPFLAGS' in os.environ: + cpp = cpp + ' ' + os.environ['CPPFLAGS'] + cflags = cflags + ' ' + os.environ['CPPFLAGS'] + ldshared = ldshared + ' ' + os.environ['CPPFLAGS'] ++ elif configure_cppflags: ++ cpp = cpp + ' ' + configure_cppflags ++ cflags = cflags + ' ' + configure_cppflags ++ ldshared = ldshared + ' ' + configure_cppflags + if 'AR' in os.environ: + ar = os.environ['AR'] + if 'ARFLAGS' in os.environ: --- python3.4-3.4.1.orig/debian/patches/doc-build.diff +++ python3.4-3.4.1/debian/patches/doc-build.diff @@ -0,0 +1,38 @@ +# DP: Allow docs to be built with Sphinx 0.5.x. + +--- a/Doc/tools/sphinxext/pyspecific.py ++++ b/Doc/tools/sphinxext/pyspecific.py +@@ -171,8 +171,15 @@ + from docutils.io import StringOutput + from docutils.utils import new_document + +-from sphinx.builders import Builder +-from sphinx.writers.text import TextWriter ++try: ++ from sphinx.builders import Builder ++except ImportError: ++ from sphinx.builder import Builder ++ ++try: ++ from sphinx.writers.text import TextWriter ++except ImportError: ++ from sphinx.textwriter import TextWriter + + + class PydocTopicsBuilder(Builder): +--- a/Doc/tools/sphinxext/suspicious.py ++++ b/Doc/tools/sphinxext/suspicious.py +@@ -47,7 +47,12 @@ + import sys + + from docutils import nodes +-from sphinx.builders import Builder ++ ++try: ++ from sphinx.builders import Builder ++except ImportError: ++ from sphinx.builder import Builder ++ + + detect_all = re.compile(r''' + ::(?=[^=])| # two :: (but NOT ::=) --- python3.4-3.4.1.orig/debian/patches/doc-faq.dpatch +++ python3.4-3.4.1/debian/patches/doc-faq.dpatch @@ -0,0 +1,52 @@ +#! /bin/sh -e + +# DP: Mention the FAQ on the documentation index page. + +dir= +if [ $# -eq 3 -a "$2" = '-d' ]; then + pdir="-d $3" + dir="$3/" +elif [ $# -ne 1 ]; then + echo >&2 "usage: `basename $0`: -patch|-unpatch [-d ]" + exit 1 +fi +case "$1" in + -patch) + patch $pdir -f --no-backup-if-mismatch -p0 < $0 + ;; + -unpatch) + patch $pdir -f --no-backup-if-mismatch -R -p0 < $0 + ;; + *) + echo >&2 "usage: `basename $0`: -patch|-unpatch [-d ]" + exit 1 +esac +exit 0 + +--- Doc/html/index.html.in~ 2002-04-01 18:11:27.000000000 +0200 ++++ Doc/html/index.html.in 2003-04-05 13:33:35.000000000 +0200 +@@ -123,6 +123,24 @@ + + + ++ ++ ++   ++

++ ++ ++   ++ ++ ++ + + +

--- python3.4-3.4.1.orig/debian/patches/enable-fpectl.diff +++ python3.4-3.4.1/debian/patches/enable-fpectl.diff @@ -0,0 +1,16 @@ +# DP: Enable the build of the fpectl module. + +Index: b/setup.py +=================================================================== +--- a/setup.py ++++ b/setup.py +@@ -1312,6 +1312,9 @@ class PyBuildExt(build_ext): + else: + missing.append('_curses_panel') + ++ #fpectl fpectlmodule.c ... ++ exts.append( Extension('fpectl', ['fpectlmodule.c']) ) ++ + # Andrew Kuchling's zlib module. Note that some versions of zlib + # 1.1.3 have security problems. See CERT Advisory CA-2002-07: + # http://www.cert.org/advisories/CA-2002-07.html --- python3.4-3.4.1.orig/debian/patches/ensurepip-wheels.diff +++ python3.4-3.4.1/debian/patches/ensurepip-wheels.diff @@ -0,0 +1,125 @@ +--- a/Lib/ensurepip/__init__.py ++++ b/Lib/ensurepip/__init__.py +@@ -1,3 +1,4 @@ ++import glob + import os + import os.path + import pkgutil +@@ -8,13 +9,9 @@ + __all__ = ["version", "bootstrap"] + + +-_SETUPTOOLS_VERSION = "2.1" +- +-_PIP_VERSION = "1.5.6" +- + # pip currently requires ssl support, so we try to provide a nicer + # error message when that is missing (http://bugs.python.org/issue19744) +-_MISSING_SSL_MESSAGE = ("pip {} requires SSL/TLS".format(_PIP_VERSION)) ++_MISSING_SSL_MESSAGE = ("pip requires SSL/TLS") + try: + import ssl + except ImportError: +@@ -26,8 +23,8 @@ + pass + + _PROJECTS = [ +- ("setuptools", _SETUPTOOLS_VERSION), +- ("pip", _PIP_VERSION), ++ "setuptools", ++ "pip", + ] + + +@@ -45,7 +42,10 @@ + """ + Returns a string specifying the bundled version of pip. + """ +- return _PIP_VERSION ++ wheel_names = glob.glob('/usr/share/python-wheels/pip-*.whl') ++ assert len(wheel_names) == 1, wheel_names ++ return os.path.basename(wheel_names[0]).split('-')[1] ++ + + def _disable_pip_configuration_settings(): + # We deliberately ignore all pip environment variables +@@ -87,20 +87,41 @@ + # omit pip and easy_install + os.environ["ENSUREPIP_OPTIONS"] = "install" + ++ # Debian: The bundled wheels are useless to us because we must use ones ++ # crafted from source code in the archive. As we build the virtual ++ # environment, copy the wheels from the system location into the virtual ++ # environment, and place those wheels on sys.path. 
++ def copy_wheels(wheels, destdir, paths): ++ for project in wheels: ++ wheel_names = glob.glob( ++ '/usr/share/python-wheels/{}-*.whl'.format(project)) ++ if len(wheel_names) == 0: ++ raise RuntimeError('missing dependency wheel %s' % project) ++ assert len(wheel_names) == 1, wheel_names ++ wheel_name = os.path.basename(wheel_names[0]) ++ path = os.path.join('/usr/share/python-wheels', wheel_name) ++ with open(path, 'rb') as fp: ++ whl = fp.read() ++ dest = os.path.join(destdir, wheel_name) ++ with open(dest, 'wb') as fp: ++ fp.write(whl) ++ paths.append(dest) ++ ++ private_wheel_dir = os.path.join(sys.prefix, 'lib', 'python-wheels') ++ os.makedirs(private_wheel_dir, exist_ok=True) ++ for project in _PROJECTS: ++ try: ++ with open('/usr/share/python-wheels/%s.dependencies' % project) as fp: ++ dependencies = [line[:-1].split()[0] for line in fp.readlines()] ++ except FileNotFoundError: ++ dependencies = [] ++ copy_wheels(dependencies, private_wheel_dir, sys.path) ++ + with tempfile.TemporaryDirectory() as tmpdir: + # Put our bundled wheels into a temporary directory and construct the + # additional paths that need added to sys.path + additional_paths = [] +- for project, version in _PROJECTS: +- wheel_name = "{}-{}-py2.py3-none-any.whl".format(project, version) +- whl = pkgutil.get_data( +- "ensurepip", +- "_bundled/{}".format(wheel_name), +- ) +- with open(os.path.join(tmpdir, wheel_name), "wb") as fp: +- fp.write(whl) +- +- additional_paths.append(os.path.join(tmpdir, wheel_name)) ++ copy_wheels(_PROJECTS, tmpdir, additional_paths) + + # Construct the arguments to be passed to the pip command + args = ["install", "--no-index", "--find-links", tmpdir] +@@ -113,7 +134,7 @@ + if verbosity: + args += ["-" + "v" * verbosity] + +- _run_pip(args + [p[0] for p in _PROJECTS], additional_paths) ++ _run_pip(args + _PROJECTS, additional_paths) + + def _uninstall_helper(*, verbosity=0): + """Helper to support a clean default uninstall process on Windows +@@ -127,7 +148,8 @@ + return + + # If the pip version doesn't match the bundled one, leave it alone +- if pip.__version__ != _PIP_VERSION: ++ # Disabled for Debian, always using the version from the python3-pip package. ++ if False and pip.__version__ != _PIP_VERSION: + msg = ("ensurepip will only uninstall a matching version " + "({!r} installed, {!r} bundled)") + print(msg.format(pip.__version__, _PIP_VERSION), file=sys.stderr) +@@ -141,7 +163,7 @@ + if verbosity: + args += ["-" + "v" * verbosity] + +- _run_pip(args + [p[0] for p in reversed(_PROJECTS)]) ++ _run_pip(args + reversed(_PROJECTS)) + + + def _main(argv=None): --- python3.4-3.4.1.orig/debian/patches/ext-no-libpython-link.diff +++ python3.4-3.4.1/debian/patches/ext-no-libpython-link.diff @@ -0,0 +1,24 @@ +# DP: Don't link extensions with the shared libpython library. 
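# Illustrative sketch (not part of this patch): the intended effect is that a
# C extension built through distutils is not linked against libpython even
# though the Debian interpreter is built with Py_ENABLE_SHARED; the extension's
# Python symbols are resolved by the interpreter at import time. For example,
# with a made-up module name and source file:
#
#     from distutils.core import setup, Extension
#     setup(name='demo', ext_modules=[Extension('demo', ['demo.c'])])
#
# after "python3.4 setup.py build_ext" the resulting demo*.so should carry no
# DT_NEEDED entry for libpython3.4m (checkable with "readelf -d").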
+ +Index: b/Lib/distutils/command/build_ext.py +=================================================================== +--- a/Lib/distutils/command/build_ext.py ++++ b/Lib/distutils/command/build_ext.py +@@ -245,7 +245,7 @@ + # For building extensions with a shared Python library, + # Python's library directory must be appended to library_dirs + # See Issues: #1600860, #4366 +- if (sysconfig.get_config_var('Py_ENABLE_SHARED')): ++ if False and (sysconfig.get_config_var('Py_ENABLE_SHARED')): + if sys.executable.startswith(os.path.join(sys.exec_prefix, "bin")): + # building third party extensions + self.library_dirs.append(sysconfig.get_config_var('LIBDIR')) +@@ -735,7 +735,7 @@ + return ext.libraries + else: + from distutils import sysconfig +- if sysconfig.get_config_var('Py_ENABLE_SHARED'): ++ if False and sysconfig.get_config_var('Py_ENABLE_SHARED'): + pythonlib = 'python{}.{}{}'.format( + sys.hexversion >> 24, (sys.hexversion >> 16) & 0xff, + sys.abiflags) --- python3.4-3.4.1.orig/debian/patches/gdbm-import.diff +++ python3.4-3.4.1/debian/patches/gdbm-import.diff @@ -0,0 +1,12 @@ +# DP: suggest installation of python3-gdbm package on failing _gdbm import + +--- a/Lib/dbm/gnu.py ++++ b/Lib/dbm/gnu.py +@@ -1,3 +1,6 @@ + """Provide the _gdbm module as a dbm submodule.""" + +-from _gdbm import * ++try: ++ from _gdbm import * ++except ImportError as msg: ++ raise ImportError(str(msg) + ', please install the python3-gdbm package') --- python3.4-3.4.1.orig/debian/patches/hg-updates.diff +++ python3.4-3.4.1/debian/patches/hg-updates.diff @@ -0,0 +1,26938 @@ +# DP: updates from the 3.4 release branch (until 2014-07-26, 91873:01c6d2893092). + +# hg diff -r v3.4.1 | filterdiff --exclude=.hgignore --exclude=.hgeol --exclude=.hgtags --remove-timestamps + +diff -r c0e311e010fc Doc/Makefile +--- a/Doc/Makefile ++++ b/Doc/Makefile +@@ -69,24 +69,30 @@ + @echo "The overview file is in build/changes." + + linkcheck: BUILDER = linkcheck +-linkcheck: build +- @echo "Link check complete; look for any errors in the above output" \ +- "or in build/$(BUILDER)/output.txt" ++linkcheck: ++ @$(MAKE) build BUILDER=$(BUILDER) || { \ ++ echo "Link check complete; look for any errors in the above output" \ ++ "or in build/$(BUILDER)/output.txt"; \ ++ false; } + + suspicious: BUILDER = suspicious +-suspicious: build +- @echo "Suspicious check complete; look for any errors in the above output" \ +- "or in build/$(BUILDER)/suspicious.csv. If all issues are false" \ +- "positives, append that file to tools/sphinxext/susp-ignored.csv." ++suspicious: ++ @$(MAKE) build BUILDER=$(BUILDER) || { \ ++ echo "Suspicious check complete; look for any errors in the above output" \ ++ "or in build/$(BUILDER)/suspicious.csv. If all issues are false" \ ++ "positives, append that file to tools/sphinxext/susp-ignored.csv."; \ ++ false; } + + coverage: BUILDER = coverage + coverage: build + @echo "Coverage finished; see c.txt and python.txt in build/coverage" + + doctest: BUILDER = doctest +-doctest: build +- @echo "Testing of doctests in the sources finished, look at the" \ +- "results in build/doctest/output.txt" ++doctest: ++ @$(MAKE) build BUILDER=$(BUILDER) || { \ ++ echo "Testing of doctests in the sources finished, look at the" \ ++ "results in build/doctest/output.txt"; \ ++ false; } + + pydoc-topics: BUILDER = pydoc-topics + pydoc-topics: build +diff -r c0e311e010fc Doc/c-api/buffer.rst +--- a/Doc/c-api/buffer.rst ++++ b/Doc/c-api/buffer.rst +@@ -89,6 +89,16 @@ + + .. c:type:: Py_buffer + ++ .. 
c:member:: void \*buf ++ ++ A pointer to the start of the logical structure described by the buffer ++ fields. This can be any location within the underlying physical memory ++ block of the exporter. For example, with negative :c:member:`~Py_buffer.strides` ++ the value may point to the end of the memory block. ++ ++ For contiguous arrays, the value points to the beginning of the memory ++ block. ++ + .. c:member:: void \*obj + + A new reference to the exporting object. The reference is owned by +@@ -101,16 +111,6 @@ + this field is *NULL*. In general, exporting objects MUST NOT + use this scheme. + +- .. c:member:: void \*buf +- +- A pointer to the start of the logical structure described by the buffer +- fields. This can be any location within the underlying physical memory +- block of the exporter. For example, with negative :c:member:`~Py_buffer.strides` +- the value may point to the end of the memory block. +- +- For contiguous arrays, the value points to the beginning of the memory +- block. +- + .. c:member:: Py_ssize_t len + + ``product(shape) * itemsize``. For contiguous arrays, this is the length +@@ -489,8 +489,8 @@ + :c:member:`view->obj` to *NULL* and return -1; + + If this function is used as part of a :ref:`getbufferproc `, +- *exporter* MUST be set to the exporting object. Otherwise, *exporter* MUST +- be NULL. ++ *exporter* MUST be set to the exporting object and *flags* must be passed ++ unmodified. Otherwise, *exporter* MUST be NULL. + + + +diff -r c0e311e010fc Doc/c-api/typeobj.rst +--- a/Doc/c-api/typeobj.rst ++++ b/Doc/c-api/typeobj.rst +@@ -597,7 +597,9 @@ + .. c:member:: richcmpfunc PyTypeObject.tp_richcompare + + An optional pointer to the rich comparison function, whose signature is +- ``PyObject *tp_richcompare(PyObject *a, PyObject *b, int op)``. ++ ``PyObject *tp_richcompare(PyObject *a, PyObject *b, int op)``. The first ++ parameter is guaranteed to be an instance of the type that is defined ++ by :c:type:`PyTypeObject`. + + The function should return the result of the comparison (usually ``Py_True`` + or ``Py_False``). If the comparison is undefined, it must return +diff -r c0e311e010fc Doc/c-api/unicode.rst +--- a/Doc/c-api/unicode.rst ++++ b/Doc/c-api/unicode.rst +@@ -1624,7 +1624,7 @@ + Compare a unicode object, *uni*, with *string* and return -1, 0, 1 for less + than, equal, and greater than, respectively. It is best to pass only + ASCII-encoded strings, but the function interprets the input string as +- ISO-8859-1 if it contains non-ASCII characters". ++ ISO-8859-1 if it contains non-ASCII characters. + + + .. c:function:: PyObject* PyUnicode_RichCompare(PyObject *left, PyObject *right, int op) +@@ -1646,7 +1646,7 @@ + .. c:function:: PyObject* PyUnicode_Format(PyObject *format, PyObject *args) + + Return a new string object from *format* and *args*; this is analogous to +- ``format % args``. The *args* argument must be a tuple. ++ ``format % args``. + + + .. c:function:: int PyUnicode_Contains(PyObject *container, PyObject *element) +diff -r c0e311e010fc Doc/distributing/index.rst +--- a/Doc/distributing/index.rst ++++ b/Doc/distributing/index.rst +@@ -48,6 +48,18 @@ + standard library, but its name lives on in other ways (such as the name + of the mailing list used to coordinate Python packaging standards + development). ++* ``setuptools`` is a (largely) drop-in replacement for ``distutils`` first ++ published in 2004. Its most notable addition over the unmodified ++ ``distutils`` tools was the ability to declare dependencies on other ++ packages. 
It is currently recommended as a more regularly updated ++ alternative to ``distutils`` that offers consistent support for more ++ recent packaging standards across a wide range of Python versions. ++* ``wheel`` (in this context) is a project that adds the ``bdist_wheel`` ++ command to ``distutils``/``setuptools``. This produces a cross platform ++ binary packaging format (called "wheels" or "wheel files" and defined in ++ :pep:`427`) that allows Python libraries, even those including binary ++ extensions, to be installed on a system without needing to be built ++ locally. + + + Open source licensing and collaboration +@@ -85,12 +97,16 @@ + + pip install setuptools wheel twine + ++The Python Packaging User Guide includes more details on the `currently ++recommended tools`_. ++ ++.. _currently recommended tools: https://packaging.python.org/en/latest/current.html#packaging-tool-recommendations + + Reading the guide + ================= + + The Python Packaging User Guide covers the various key steps and elements +-involved in creating a project ++involved in creating a project: + + * `Project structure`_ + * `Building and packaging the project`_ +diff -r c0e311e010fc Doc/distutils/examples.rst +--- a/Doc/distutils/examples.rst ++++ b/Doc/distutils/examples.rst +@@ -193,9 +193,6 @@ + packages=['foobar', 'foobar.subfoo'], + ) + +-(Again, the empty string in :option:`package_dir` stands for the current +-directory.) +- + + .. _single-ext: + +diff -r c0e311e010fc Doc/faq/design.rst +--- a/Doc/faq/design.rst ++++ b/Doc/faq/design.rst +@@ -664,62 +664,6 @@ + sloppy and not write test cases at all. + + +-Why are default values shared between objects? +----------------------------------------------- +- +-This type of bug commonly bites neophyte programmers. Consider this function:: +- +- def foo(mydict={}): # Danger: shared reference to one dict for all calls +- ... compute something ... +- mydict[key] = value +- return mydict +- +-The first time you call this function, ``mydict`` contains a single item. The +-second time, ``mydict`` contains two items because when ``foo()`` begins +-executing, ``mydict`` starts out with an item already in it. +- +-It is often expected that a function call creates new objects for default +-values. This is not what happens. Default values are created exactly once, when +-the function is defined. If that object is changed, like the dictionary in this +-example, subsequent calls to the function will refer to this changed object. +- +-By definition, immutable objects such as numbers, strings, tuples, and ``None``, +-are safe from change. Changes to mutable objects such as dictionaries, lists, +-and class instances can lead to confusion. +- +-Because of this feature, it is good programming practice to not use mutable +-objects as default values. Instead, use ``None`` as the default value and +-inside the function, check if the parameter is ``None`` and create a new +-list/dictionary/whatever if it is. For example, don't write:: +- +- def foo(mydict={}): +- ... +- +-but:: +- +- def foo(mydict=None): +- if mydict is None: +- mydict = {} # create a new dict for local namespace +- +-This feature can be useful. When you have a function that's time-consuming to +-compute, a common technique is to cache the parameters and the resulting value +-of each call to the function, and return the cached value if the same value is +-requested again. This is called "memoizing", and can be implemented like this:: +- +- # Callers will never provide a third parameter for this function. 
+- def expensive(arg1, arg2, _cache={}): +- if (arg1, arg2) in _cache: +- return _cache[(arg1, arg2)] +- +- # Calculate the value +- result = ... expensive computation ... +- _cache[(arg1, arg2)] = result # Store result in the cache +- return result +- +-You could use a global variable containing a dictionary instead of the default +-value; it's a matter of taste. +- +- + Why is there no goto? + --------------------- + +diff -r c0e311e010fc Doc/faq/programming.rst +--- a/Doc/faq/programming.rst ++++ b/Doc/faq/programming.rst +@@ -352,6 +352,62 @@ + occur when the module is initialized. + + ++Why are default values shared between objects? ++---------------------------------------------- ++ ++This type of bug commonly bites neophyte programmers. Consider this function:: ++ ++ def foo(mydict={}): # Danger: shared reference to one dict for all calls ++ ... compute something ... ++ mydict[key] = value ++ return mydict ++ ++The first time you call this function, ``mydict`` contains a single item. The ++second time, ``mydict`` contains two items because when ``foo()`` begins ++executing, ``mydict`` starts out with an item already in it. ++ ++It is often expected that a function call creates new objects for default ++values. This is not what happens. Default values are created exactly once, when ++the function is defined. If that object is changed, like the dictionary in this ++example, subsequent calls to the function will refer to this changed object. ++ ++By definition, immutable objects such as numbers, strings, tuples, and ``None``, ++are safe from change. Changes to mutable objects such as dictionaries, lists, ++and class instances can lead to confusion. ++ ++Because of this feature, it is good programming practice to not use mutable ++objects as default values. Instead, use ``None`` as the default value and ++inside the function, check if the parameter is ``None`` and create a new ++list/dictionary/whatever if it is. For example, don't write:: ++ ++ def foo(mydict={}): ++ ... ++ ++but:: ++ ++ def foo(mydict=None): ++ if mydict is None: ++ mydict = {} # create a new dict for local namespace ++ ++This feature can be useful. When you have a function that's time-consuming to ++compute, a common technique is to cache the parameters and the resulting value ++of each call to the function, and return the cached value if the same value is ++requested again. This is called "memoizing", and can be implemented like this:: ++ ++ # Callers will never provide a third parameter for this function. ++ def expensive(arg1, arg2, _cache={}): ++ if (arg1, arg2) in _cache: ++ return _cache[(arg1, arg2)] ++ ++ # Calculate the value ++ result = ... expensive computation ... ++ _cache[(arg1, arg2)] = result # Store result in the cache ++ return result ++ ++You could use a global variable containing a dictionary instead of the default ++value; it's a matter of taste. ++ ++ + How can I pass optional or keyword parameters from one function to another? + --------------------------------------------------------------------------- + +diff -r c0e311e010fc Doc/howto/functional.rst +--- a/Doc/howto/functional.rst ++++ b/Doc/howto/functional.rst +@@ -583,7 +583,7 @@ + + Because ``yield`` will often be returning ``None``, you should always check for + this case. Don't just use its value in expressions unless you're sure that the +-:meth:`~generator.send` method will be the only method used resume your ++:meth:`~generator.send` method will be the only method used to resume your + generator function. 
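# Illustrative sketch (not part of the upstream change quoted here): the check
# described in the paragraph above typically looks like the howto's counter
# example, where a real value only arrives via send():
#
#     def counter(maximum):
#         i = 0
#         while i < maximum:
#             val = (yield i)        # val is None when resumed with next()
#             if val is not None:    # only send() supplies a value
#                 i = val
#             else:
#                 i += 1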
+ + In addition to :meth:`~generator.send`, there are two other methods on +diff -r c0e311e010fc Doc/howto/logging.rst +--- a/Doc/howto/logging.rst ++++ b/Doc/howto/logging.rst +@@ -1027,6 +1027,15 @@ + so that if the logger's threshold is set above ``DEBUG``, the calls to + :func:`expensive_func1` and :func:`expensive_func2` are never made. + ++.. note:: In some cases, :meth:`~Logger.isEnabledFor` can itself be more ++ expensive than you'd like (e.g. for deeply nested loggers where an explicit ++ level is only set high up in the logger hierarchy). In such cases (or if you ++ want to avoid calling a method in tight loops), you can cache the result of a ++ call to :meth:`~Logger.isEnabledFor` in a local or instance variable, and use ++ that instead of calling the method each time. Such a cached value would only ++ need to be recomputed when the logging configuration changes dynamically ++ while the application is running (which is not all that common). ++ + There are other optimizations which can be made for specific applications which + need more precise control over what logging information is collected. Here's a + list of things you can do to avoid processing during logging which you don't +@@ -1036,6 +1045,12 @@ + | What you don't want to collect | How to avoid collecting it | + +===============================================+========================================+ + | Information about where calls were made from. | Set ``logging._srcfile`` to ``None``. | ++| | This avoids calling | ++| | :func:`sys._getframe`, which may help | ++| | to speed up your code in environments | ++| | like PyPy (which can't speed up code | ++| | that uses :func:`sys._getframe`), if | ++| | and when PyPy supports Python 3.x. | + +-----------------------------------------------+----------------------------------------+ + | Threading information. | Set ``logging.logThreads`` to ``0``. | + +-----------------------------------------------+----------------------------------------+ +diff -r c0e311e010fc Doc/howto/sockets.rst +--- a/Doc/howto/sockets.rst ++++ b/Doc/howto/sockets.rst +@@ -180,7 +180,7 @@ + Assuming you don't want to end the connection, the simplest solution is a fixed + length message:: + +- class mysocket: ++ class MySocket: + """demonstration class only + - coded for clarity, not efficiency + """ +@@ -189,8 +189,8 @@ + if sock is None: + self.sock = socket.socket( + socket.AF_INET, socket.SOCK_STREAM) +- else: +- self.sock = sock ++ else: ++ self.sock = sock + + def connect(self, host, port): + self.sock.connect((host, port)) +@@ -204,13 +204,15 @@ + totalsent = totalsent + sent + + def myreceive(self): +- msg = b'' +- while len(msg) < MSGLEN: +- chunk = self.sock.recv(MSGLEN-len(msg)) ++ chunks = [] ++ bytes_recd = 0 ++ while bytes_recd < MSGLEN: ++ chunk = self.sock.recv(min(MSGLEN - bytes_recd, 2048)) + if chunk == b'': + raise RuntimeError("socket connection broken") +- msg = msg + chunk +- return msg ++ chunks.append(chunk) ++ bytes_recd = bytes_recd + len(chunk) ++ return b''.join(chunks) + + The sending code here is usable for almost any messaging scheme - in Python you + send strings, and you can use ``len()`` to determine its length (even if it has +diff -r c0e311e010fc Doc/howto/urllib2.rst +--- a/Doc/howto/urllib2.rst ++++ b/Doc/howto/urllib2.rst +@@ -97,7 +97,7 @@ + ---- + + Sometimes you want to send data to a URL (often the URL will refer to a CGI +-(Common Gateway Interface) script [#]_ or other web application). 
With HTTP, ++(Common Gateway Interface) script or other web application). With HTTP, + this is often done using what's known as a **POST** request. This is often what + your browser does when you submit a HTML form that you filled in on the web. Not + all POSTs have to come from forms: you can use a POST to transmit arbitrary data +@@ -572,8 +572,6 @@ + + This document was reviewed and revised by John Lee. + +-.. [#] For an introduction to the CGI protocol see +- `Writing Web Applications in Python `_. + .. [#] Like Google for example. The *proper* way to use google from a program + is to use `PyGoogle `_ of course. See + `Voidspace Google `_ +diff -r c0e311e010fc Doc/howto/webservers.rst +--- a/Doc/howto/webservers.rst ++++ b/Doc/howto/webservers.rst +@@ -687,7 +687,7 @@ + The newest version of TurboGears, version 2.0, moves even further in direction + of WSGI support and a component-based architecture. TurboGears 2 is based on + the WSGI stack of another popular component-based web framework, `Pylons +-`_. ++`_. + + + Zope +diff -r c0e311e010fc Doc/library/2to3.rst +--- a/Doc/library/2to3.rst ++++ b/Doc/library/2to3.rst +@@ -392,7 +392,7 @@ + Replaces use of the :class:`set` constructor with set literals. This fixer + is optional. + +-.. 2to3fixer:: standard_error ++.. 2to3fixer:: standarderror + + Renames :exc:`StandardError` to :exc:`Exception`. + +diff -r c0e311e010fc Doc/library/__main__.rst +--- a/Doc/library/__main__.rst ++++ b/Doc/library/__main__.rst +@@ -12,7 +12,7 @@ + A module can discover whether or not it is running in the main scope by + checking its own ``__name__``, which allows a common idiom for conditionally + executing code in a module when it is run as a script or with ``python +--m`` but not when it is imported: ++-m`` but not when it is imported:: + + if __name__ == "__main__": + # execute only if run as a script +diff -r c0e311e010fc Doc/library/_thread.rst +--- a/Doc/library/_thread.rst ++++ b/Doc/library/_thread.rst +@@ -176,10 +176,6 @@ + * Calling :func:`sys.exit` or raising the :exc:`SystemExit` exception is + equivalent to calling :func:`_thread.exit`. + +-* Not all built-in functions that may block waiting for I/O allow other threads +- to run. (The most popular ones (:func:`time.sleep`, :meth:`io.FileIO.read`, +- :func:`select.select`) work as expected.) +- + * It is not possible to interrupt the :meth:`acquire` method on a lock --- the + :exc:`KeyboardInterrupt` exception will happen after the lock has been acquired. + +diff -r c0e311e010fc Doc/library/asyncio-dev.rst +--- a/Doc/library/asyncio-dev.rst ++++ b/Doc/library/asyncio-dev.rst +@@ -9,6 +9,29 @@ + This page lists common traps and explains how to avoid them. + + ++.. _asyncio-debug-mode: ++ ++Debug mode of asyncio ++--------------------- ++ ++To enable the debug mode globally, set the environment variable ++:envvar:`PYTHONASYNCIODEBUG` to ``1``. Examples of effects of the debug mode: ++ ++* Log :ref:`coroutines defined but never "yielded from" ++ ` ++* :meth:`~BaseEventLoop.call_soon` and :meth:`~BaseEventLoop.call_at` methods ++ raise an exception if they are called from the wrong thread. ++* Log the execution time of the selector ++* Log callbacks taking more than 100 ms to be executed. The ++ :attr:`BaseEventLoop.slow_callback_duration` attribute is the minimum ++ duration in seconds of "slow" callbacks. ++ ++.. seealso:: ++ ++ The :meth:`BaseEventLoop.set_debug` method and the :ref:`asyncio logger ++ `. ++ ++ + .. 
_asyncio-multithreading: + + Concurrency and multithreading +@@ -80,20 +103,11 @@ + Detect coroutine objects never scheduled + ---------------------------------------- + +-When a coroutine function is called but not passed to :func:`async` or to the +-:class:`Task` constructor, it is not scheduled and it is probably a bug. +- +-To detect such bug, set the environment variable :envvar:`PYTHONASYNCIODEBUG` +-to ``1``. When the coroutine object is destroyed by the garbage collector, a +-log will be emitted with the traceback where the coroutine function was called. +-See the :ref:`asyncio logger `. +- +-The debug flag changes the behaviour of the :func:`coroutine` decorator. The +-debug flag value is only used when then coroutine function is defined, not when +-it is called. Coroutine functions defined before the debug flag is set to +-``True`` will not be tracked. For example, it is not possible to debug +-coroutines defined in the :mod:`asyncio` module, because the module must be +-imported before the flag value can be changed. ++When a coroutine function is called and its result is not passed to ++:func:`async` or to the :meth:`BaseEventLoop.create_task` method: the execution ++of the coroutine objet will never be scheduled and it is probably a bug. ++:ref:`Enable the debug mode of asyncio ` to :ref:`log a ++warning ` to detect it. + + Example with the bug:: + +@@ -107,20 +121,27 @@ + + Output in debug mode:: + +- Coroutine 'test' defined at test.py:4 was never yielded from ++ Coroutine test() at test.py:3 was never yielded from ++ Coroutine object created at (most recent call last): ++ File "test.py", line 7, in ++ test() + +-The fix is to call the :func:`async` function or create a :class:`Task` object +-with this coroutine object. ++The fix is to call the :func:`async` function or the ++:meth:`BaseEventLoop.create_task` method with the coroutine object. + ++.. seealso:: + +-Detect exceptions not consumed +------------------------------- ++ :ref:`Pending task destroyed `. ++ ++ ++Detect exceptions never consumed ++-------------------------------- + + Python usually calls :func:`sys.displayhook` on unhandled exceptions. If +-:meth:`Future.set_exception` is called, but the exception is not consumed, +-:func:`sys.displayhook` is not called. Instead, a log is emitted when the +-future is deleted by the garbage collector, with the traceback where the +-exception was raised. See the :ref:`asyncio logger `. ++:meth:`Future.set_exception` is called, but the exception is never consumed, ++:func:`sys.displayhook` is not called. Instead, a :ref:`a log is emitted ++` when the future is deleted by the garbage collector, with the ++traceback where the exception was raised. 
+ + Example of unhandled exception:: + +@@ -136,16 +157,27 @@ + + Output:: + +- Future/Task exception was never retrieved: ++ Task exception was never retrieved ++ future: ++ source_traceback: Object created at (most recent call last): ++ File "test.py", line 10, in ++ asyncio.async(bug()) ++ File "asyncio/tasks.py", line 510, in async ++ task = loop.create_task(coro_or_future) + Traceback (most recent call last): +- File "/usr/lib/python3.4/asyncio/tasks.py", line 279, in _step ++ File "asyncio/tasks.py", line 244, in _step + result = next(coro) +- File "/usr/lib/python3.4/asyncio/tasks.py", line 80, in coro ++ File "coroutines.py", line 78, in __next__ ++ return next(self.gen) ++ File "asyncio/coroutines.py", line 141, in coro + res = func(*args, **kw) +- File "test.py", line 5, in bug ++ File "test.py", line 7, in bug + raise Exception("not consumed") + Exception: not consumed + ++:ref:`Enable the debug mode of asyncio ` to get the ++traceback where the task was created. ++ + There are different options to fix this issue. The first option is to chain to + coroutine in another coroutine and use classic try/except:: + +@@ -172,7 +204,7 @@ + See also the :meth:`Future.exception` method. + + +-Chain coroutines correctly ++Chain correctly coroutines + -------------------------- + + When a coroutine function calls other coroutine functions and tasks, they +@@ -223,7 +255,9 @@ + + (3) close file + (2) write into file +- Pending tasks at exit: {Task()} ++ Pending tasks at exit: {>} ++ Task was destroyed but it is pending! ++ task: > + + The loop stopped before the ``create()`` finished, ``close()`` has been called + before ``write()``, whereas coroutine functions were called in this order: +@@ -249,3 +283,29 @@ + yield from asyncio.sleep(2.0) + loop.stop() + ++ ++.. _asyncio-pending-task-destroyed: ++ ++Pending task destroyed ++---------------------- ++ ++If a pending task is destroyed, the execution of its wrapped :ref:`coroutine ++` did not complete. It is probably a bug and so a warning is logged. ++ ++Example of log:: ++ ++ Task was destroyed but it is pending! ++ source_traceback: Object created at (most recent call last): ++ File "test.py", line 17, in ++ task = asyncio.async(coro, loop=loop) ++ File "asyncio/tasks.py", line 510, in async ++ task = loop.create_task(coro_or_future) ++ task: > ++ ++:ref:`Enable the debug mode of asyncio ` to get the ++traceback where the task was created. ++ ++.. seealso:: ++ ++ :ref:`Detect coroutine objects never scheduled `. ++ +diff -r c0e311e010fc Doc/library/asyncio-eventloop.rst +--- a/Doc/library/asyncio-eventloop.rst ++++ b/Doc/library/asyncio-eventloop.rst +@@ -2,8 +2,8 @@ + + .. _asyncio-event-loop: + +-Event loops +-=========== ++Base Event Loop ++=============== + + The event loop is the central execution device provided by :mod:`asyncio`. + It provides multiple facilities, amongst which: +@@ -18,78 +18,9 @@ + + * Delegating costly function calls to a pool of threads. + +-Event loop policies and the default policy +------------------------------------------- ++.. class:: BaseEventLoop + +-Event loop management is abstracted with a *policy* pattern, to provide maximal +-flexibility for custom platforms and frameworks. Throughout the execution of a +-process, a single global policy object manages the event loops available to the +-process based on the calling context. A policy is an object implementing the +-:class:`AbstractEventLoopPolicy` interface. 
+- +-For most users of :mod:`asyncio`, policies never have to be dealt with +-explicitly, since the default global policy is sufficient. +- +-The default policy defines context as the current thread, and manages an event +-loop per thread that interacts with :mod:`asyncio`. The module-level functions +-:func:`get_event_loop` and :func:`set_event_loop` provide convenient access to +-event loops managed by the default policy. +- +-Event loop functions +--------------------- +- +-The following functions are convenient shortcuts to accessing the methods of the +-global policy. Note that this provides access to the default policy, unless an +-alternative policy was set by calling :func:`set_event_loop_policy` earlier in +-the execution of the process. +- +-.. function:: get_event_loop() +- +- Equivalent to calling ``get_event_loop_policy().get_event_loop()``. +- +-.. function:: set_event_loop(loop) +- +- Equivalent to calling ``get_event_loop_policy().set_event_loop(loop)``. +- +-.. function:: new_event_loop() +- +- Equivalent to calling ``get_event_loop_policy().new_event_loop()``. +- +-Event loop policy interface +---------------------------- +- +-An event loop policy must implement the following interface: +- +-.. class:: AbstractEventLoopPolicy +- +- .. method:: get_event_loop() +- +- Get the event loop for current context. Returns an event loop object +- implementing :class:`BaseEventLoop` interface, or raises an exception in case +- no event loop has been set for the current context and the current policy +- does not specify to create one. It should never return ``None``. +- +- .. method:: set_event_loop(loop) +- +- Set the event loop of the current context to *loop*. +- +- .. method:: new_event_loop() +- +- Create and return a new event loop object according to this policy's rules. +- If there's need to set this loop as the event loop of the current context, +- :meth:`set_event_loop` must be called explicitly. +- +-Access to the global loop policy +--------------------------------- +- +-.. function:: get_event_loop_policy() +- +- Get the current event loop policy. +- +-.. function:: set_event_loop_policy(policy) +- +- Set the current event loop policy. If *policy* is ``None``, the default +- policy is restored. ++ Base class of event loops. + + Run an event loop + ----------------- +@@ -102,8 +33,8 @@ + + Run until the :class:`Future` is done. + +- If the argument is a :ref:`coroutine `, it is wrapped +- in a :class:`Task`. ++ If the argument is a :ref:`coroutine object `, it is wrapped by ++ :func:`async`. + + Return the Future's result, or raise its exception. + +@@ -116,12 +47,19 @@ + Stop running the event loop. + + Every callback scheduled before :meth:`stop` is called will run. +- Callback scheduled after :meth:`stop` is called won't. However, those +- callbacks will run if :meth:`run_forever` is called again later. ++ Callbacks scheduled after :meth:`stop` is called will not run. ++ However, those callbacks will run if :meth:`run_forever` is called ++ again later. ++ ++.. method:: BaseEventLoop.is_closed() ++ ++ Returns ``True`` if the event loop was closed. ++ ++ .. versionadded:: 3.4.2 + + .. method:: BaseEventLoop.close() + +- Close the event loop. The loop should not be running. ++ Close the event loop. The loop must not be running. + + This clears the queues and shuts down the executor, but does not wait for + the executor to finish. +@@ -197,6 +135,25 @@ + The :func:`asyncio.sleep` function. + + ++Coroutines ++---------- ++ ++.. 
method:: BaseEventLoop.create_task(coro) ++ ++ Schedule the execution of a :ref:`coroutine object `: wrap it in ++ a future. Return a :class:`Task` object. ++ ++ Third-party event loops can use their own subclass of :class:`Task` for ++ interoperability. In this case, the result type is a subclass of ++ :class:`Task`. ++ ++ .. seealso:: ++ ++ The :meth:`async` function. ++ ++ .. versionadded:: 3.4.2 ++ ++ + Creating connections + -------------------- + +@@ -305,11 +262,11 @@ + + .. method:: BaseEventLoop.create_server(protocol_factory, host=None, port=None, \*, family=socket.AF_UNSPEC, flags=socket.AI_PASSIVE, sock=None, backlog=100, ssl=None, reuse_address=None) + +- A :ref:`coroutine ` method which creates a TCP server bound to +- host and port. ++ Create a TCP server bound to host and port. Return a :class:`Server` object, ++ its :attr:`~Server.sockets` attribute contains created sockets. Use the ++ :meth:`Server.close` method to stop the server: close listening sockets. + +- The return value is a :class:`AbstractServer` object which can be used to stop +- the service. ++ This method is a :ref:`coroutine `. + + If *host* is an empty string or None all interfaces are assumed + and a list of multiple sockets will be returned (most likely +@@ -349,7 +306,6 @@ + Availability: UNIX. + + +- + Watch file descriptors + ---------------------- + +@@ -455,11 +411,11 @@ + + .. method:: BaseEventLoop.connect_read_pipe(protocol_factory, pipe) + +- Register read pipe in eventloop. ++ Register read pipe in eventloop. Set the *pipe* to non-blocking mode. + + *protocol_factory* should instantiate object with :class:`Protocol` +- interface. pipe is file-like object already switched to nonblocking. +- Return pair (transport, protocol), where transport support ++ interface. *pipe* is a :term:`file-like object `. ++ Return pair ``(transport, protocol)``, where *transport* supports the + :class:`ReadTransport` interface. + + This method is a :ref:`coroutine `. +@@ -580,31 +536,55 @@ + + .. method:: BaseEventLoop.get_debug() + +- Get the debug mode (:class:`bool`) of the event loop, ``False`` by default. ++ Get the debug mode (:class:`bool`) of the event loop. ++ ++ The default value is ``True`` if the environment variable ++ :envvar:`PYTHONASYNCIODEBUG` is set to a non-empty string, ``False`` ++ otherwise. ++ ++ .. versionadded:: 3.4.2 + + .. method:: BaseEventLoop.set_debug(enabled: bool) + + Set the debug mode of the event loop. + ++ .. versionadded:: 3.4.2 ++ + .. seealso:: + +- The :ref:`Develop with asyncio ` section. +- ++ The :ref:`debug mode of asyncio `. + + Server + ------ + +-.. class:: AbstractServer ++.. class:: Server + +- Abstract server returned by :func:`BaseEventLoop.create_server`. ++ Server listening on sockets. ++ ++ Object created by the :meth:`BaseEventLoop.create_server` method and the ++ :func:`start_server` function. Don't instanciate the class directly. + + .. method:: close() + +- Stop serving. This leaves existing connections open. ++ Stop serving: close listening sockets and set the :attr:`sockets` ++ attribute to ``None``. ++ ++ The sockets that represent existing incoming client connections are ++ leaved open. ++ ++ The server is closed asynchonously, use the :meth:`wait_closed` coroutine ++ to wait until the server is closed. + + .. method:: wait_closed() + +- A :ref:`coroutine ` to wait until service is closed. ++ Wait until the :meth:`close` method completes. ++ ++ This method is a :ref:`coroutine `. ++ ++ .. 
attribute:: sockets ++ ++ List of :class:`socket.socket` objects the server is listening to, or ++ ``None`` if the server is closed. + + + Handle +@@ -618,7 +598,8 @@ + + .. method:: cancel() + +- Cancel the call. ++ Cancel the call. ++ + + + .. _asyncio-hello-world-callback: +@@ -636,7 +617,10 @@ + + loop = asyncio.get_event_loop() + loop.call_soon(print_and_repeat, loop) +- loop.run_forever() ++ try: ++ loop.run_forever() ++ finally: ++ loop.close() + + .. seealso:: + +@@ -664,5 +648,8 @@ + + print("Event loop running forever, press CTRL+c to interrupt.") + print("pid %s: send SIGINT or SIGTERM to exit." % os.getpid()) +- loop.run_forever() ++ try: ++ loop.run_forever() ++ finally: ++ loop.close() + +diff -r c0e311e010fc Doc/library/asyncio-eventloops.rst +--- /dev/null ++++ b/Doc/library/asyncio-eventloops.rst +@@ -0,0 +1,186 @@ ++.. currentmodule:: asyncio ++ ++Event loops ++=========== ++ ++Event loop functions ++-------------------- ++ ++The following functions are convenient shortcuts to accessing the methods of the ++global policy. Note that this provides access to the default policy, unless an ++alternative policy was set by calling :func:`set_event_loop_policy` earlier in ++the execution of the process. ++ ++.. function:: get_event_loop() ++ ++ Equivalent to calling ``get_event_loop_policy().get_event_loop()``. ++ ++.. function:: set_event_loop(loop) ++ ++ Equivalent to calling ``get_event_loop_policy().set_event_loop(loop)``. ++ ++.. function:: new_event_loop() ++ ++ Equivalent to calling ``get_event_loop_policy().new_event_loop()``. ++ ++ ++.. _asyncio-event-loops: ++ ++Available event loops ++--------------------- ++ ++asyncio currently provides two implementations of event loops: ++:class:`SelectorEventLoop` and :class:`ProactorEventLoop`. ++ ++.. class:: SelectorEventLoop ++ ++ Event loop based on the :mod:`selectors` module. Subclass of ++ :class:`BaseEventLoop`. ++ ++ Use the most efficient selector available on the platform. ++ ++.. class:: ProactorEventLoop ++ ++ Proactor event loop for Windows using "I/O Completion Ports" aka IOCP. ++ Subclass of :class:`BaseEventLoop`. ++ ++ Availability: Windows. ++ ++ .. seealso:: ++ ++ `MSDN documentation on I/O Completion Ports ++ `_. ++ ++Example to use a :class:`ProactorEventLoop` on Windows:: ++ ++ import asyncio, os ++ ++ if os.name == 'nt': ++ loop = asyncio.ProactorEventLoop() ++ asyncio.set_event_loop(loop) ++ ++.. _asyncio-platform-support: ++ ++Platform support ++---------------- ++ ++The :mod:`asyncio` module has been designed to be portable, but each platform ++still has subtle differences and may not support all :mod:`asyncio` features. ++ ++Windows ++^^^^^^^ ++ ++Common limits of Windows event loops: ++ ++- :meth:`~BaseEventLoop.create_unix_server` and ++ :meth:`~BaseEventLoop.create_unix_server` are not supported: the socket ++ family :data:`socket.AF_UNIX` is specific to UNIX ++- :meth:`~BaseEventLoop.add_signal_handler` and ++ :meth:`~BaseEventLoop.remove_signal_handler` are not supported ++- :meth:`EventLoopPolicy.set_child_watcher` is not supported. ++ :class:`ProactorEventLoop` supports subprocesses. It has only one ++ implementation to watch child processes, there is no need to configure it. 
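As a rough illustration of the point above that only :class:`ProactorEventLoop` can run subprocesses on Windows, here is a minimal sketch (assuming Python 3.4 asyncio; the coroutine name is illustrative) that picks a suitable loop and runs one child process::

    import asyncio, os, sys

    if os.name == 'nt':
        # The default SelectorEventLoop cannot run subprocesses on Windows.
        loop = asyncio.ProactorEventLoop()
        asyncio.set_event_loop(loop)
    else:
        loop = asyncio.get_event_loop()

    @asyncio.coroutine
    def show_version():
        # Run "python -c ..." and capture its standard output.
        proc = yield from asyncio.create_subprocess_exec(
            sys.executable, '-c', 'import sys; print(sys.version)',
            stdout=asyncio.subprocess.PIPE)
        out, _ = yield from proc.communicate()
        print(out.decode().strip())

    loop.run_until_complete(show_version())
    loop.close()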
++ ++:class:`SelectorEventLoop` specific limits: ++ ++- :class:`~selectors.SelectSelector` is used but it only supports sockets, ++ see the `MSDN documentation of select ++ `_ ++- :meth:`~BaseEventLoop.add_reader` and :meth:`~BaseEventLoop.add_writer` only ++ accept file descriptors of sockets ++- Pipes are not supported ++ (ex: :meth:`~BaseEventLoop.connect_read_pipe`, ++ :meth:`~BaseEventLoop.connect_write_pipe`) ++- :ref:`Subprocesses ` are not supported ++ (ex: :meth:`~BaseEventLoop.subprocess_exec`, ++ :meth:`~BaseEventLoop.subprocess_shell`) ++ ++:class:`ProactorEventLoop` specific limits: ++ ++- SSL is not supported: :meth:`~BaseEventLoop.create_connection` and ++ :meth:`~BaseEventLoop.create_server` cannot be used with SSL for example ++- :meth:`~BaseEventLoop.create_datagram_endpoint` (UDP) is not supported ++- :meth:`~BaseEventLoop.add_reader` and :meth:`~BaseEventLoop.add_writer` are ++ not supported ++ ++The resolution of the monotonic clock on Windows is usually around 15.6 msec. ++The best resolution is 0.5 msec. The resolution depends on the hardware ++(availability of `HPET ++`_) and on the Windows ++configuration. See :ref:`asyncio delayed calls `. ++ ++ ++Mac OS X ++^^^^^^^^ ++ ++Character devices like PTY are only well supported since Mavericks (Mac OS ++10.9). They are not supported at all on Mac OS 10.5 and older. ++ ++On Mac OS 10.6, 10.7 and 10.8, the default event loop is ++:class:`SelectorEventLoop` which uses :class:`selectors.KqueueSelector`. ++:class:`selectors.KqueueSelector` does not support character devices on these ++versions. The :class:`SelectorEventLoop` can be used with ++:class:`~selectors.SelectSelector` or :class:`~selectors.PollSelector` to ++support character devices on these versions of Mac OS X. Example:: ++ ++ import asyncio ++ import selectors ++ ++ selector = selectors.SelectSelector() ++ loop = asyncio.SelectorEventLoop(selector) ++ asyncio.set_event_loop(loop) ++ ++ ++Event loop policies and the default policy ++------------------------------------------ ++ ++Event loop management is abstracted with a *policy* pattern, to provide maximal ++flexibility for custom platforms and frameworks. Throughout the execution of a ++process, a single global policy object manages the event loops available to the ++process based on the calling context. A policy is an object implementing the ++:class:`AbstractEventLoopPolicy` interface. ++ ++For most users of :mod:`asyncio`, policies never have to be dealt with ++explicitly, since the default global policy is sufficient. ++ ++The default policy defines context as the current thread, and manages an event ++loop per thread that interacts with :mod:`asyncio`. The module-level functions ++:func:`get_event_loop` and :func:`set_event_loop` provide convenient access to ++event loops managed by the default policy. ++ ++Event loop policy interface ++--------------------------- ++ ++An event loop policy must implement the following interface: ++ ++.. class:: AbstractEventLoopPolicy ++ ++ .. method:: get_event_loop() ++ ++ Get the event loop for the current context. Returns an event loop object ++ implementing the :class:`BaseEventLoop` interface, or raises an exception in case ++ no event loop has been set for the current context and the current policy ++ does not specify to create one. It should never return ``None``. ++ ++ .. method:: set_event_loop(loop) ++ ++ Set the event loop for the current context to *loop*. ++ ++ .. 
method:: new_event_loop() ++ ++ Create and return a new event loop object according to this policy's rules. ++ If there's need to set this loop as the event loop for the current context, ++ :meth:`set_event_loop` must be called explicitly. ++ ++Access to the global loop policy ++-------------------------------- ++ ++.. function:: get_event_loop_policy() ++ ++ Get the current event loop policy. ++ ++.. function:: set_event_loop_policy(policy) ++ ++ Set the current event loop policy. If *policy* is ``None``, the default ++ policy is restored. ++ +diff -r c0e311e010fc Doc/library/asyncio-protocol.rst +--- a/Doc/library/asyncio-protocol.rst ++++ b/Doc/library/asyncio-protocol.rst +@@ -272,8 +272,8 @@ + Connection callbacks + -------------------- + +-These callbacks may be called on :class:`Protocol` and +-:class:`SubprocessProtocol` instances: ++These callbacks may be called on :class:`Protocol`, :class:`DatagramProtocol` ++and :class:`SubprocessProtocol` instances: + + .. method:: BaseProtocol.connection_made(transport) + +@@ -291,10 +291,10 @@ + The latter means a regular EOF is received, or the connection was + aborted or closed by this side of the connection. + +-:meth:`connection_made` and :meth:`connection_lost` are called exactly once +-per successful connection. All other callbacks will be called between those +-two methods, which allows for easier resource management in your protocol +-implementation. ++:meth:`~BaseProtocol.connection_made` and :meth:`~BaseProtocol.connection_lost` ++are called exactly once per successful connection. All other callbacks will be ++called between those two methods, which allows for easier resource management ++in your protocol implementation. + + The following callbacks may be called only on :class:`SubprocessProtocol` + instances: +@@ -459,7 +459,7 @@ + example to raise an exception if the server is not listening, instead of + having to write a short coroutine to handle the exception and stop the + running loop. At :meth:`~BaseEventLoop.run_until_complete` exit, the loop is +-no more running, so there is no need to stop the loop in case of an error. ++no longer running, so there is no need to stop the loop in case of an error. + + Echo server + ----------- +diff -r c0e311e010fc Doc/library/asyncio-stream.rst +--- a/Doc/library/asyncio-stream.rst ++++ b/Doc/library/asyncio-stream.rst +@@ -34,28 +34,26 @@ + + .. function:: start_server(client_connected_cb, host=None, port=None, \*, loop=None, limit=None, **kwds) + +- Start a socket server, with a callback for each client connected. ++ Start a socket server, with a callback for each client connected. The return ++ value is the same as :meth:`~BaseEventLoop.create_server()`. + +- The first parameter, *client_connected_cb*, takes two parameters: ++ The *client_connected_cb* parameter is called with two parameters: + *client_reader*, *client_writer*. *client_reader* is a + :class:`StreamReader` object, while *client_writer* is a +- :class:`StreamWriter` object. This parameter can either be a plain callback +- function or a :ref:`coroutine function `; if it is a coroutine +- function, it will be automatically converted into a :class:`Task`. ++ :class:`StreamWriter` object. The *client_connected_cb* parameter can ++ either be a plain callback function or a :ref:`coroutine function ++ `; if it is a coroutine function, it will be automatically ++ wrapped in a future using the :meth:`BaseEventLoop.create_task` method. 
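As a rough sketch of what such a callback can look like (Python 3.4 ``yield from`` style; the handler name, host and port are illustrative only), a tiny line-echo server could be written as::

    import asyncio

    @asyncio.coroutine
    def handle_client(client_reader, client_writer):
        # Echo a single line back to the client, then close the connection.
        data = yield from client_reader.readline()
        client_writer.write(data)
        yield from client_writer.drain()
        client_writer.close()

    loop = asyncio.get_event_loop()
    server = loop.run_until_complete(
        asyncio.start_server(handle_client, '127.0.0.1', 8888, loop=loop))
    try:
        loop.run_forever()
    finally:
        server.close()
        loop.run_until_complete(server.wait_closed())
        loop.close()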
+ + The rest of the arguments are all the usual arguments to + :meth:`~BaseEventLoop.create_server()` except *protocol_factory*; most +- common are positional host and port, with various optional keyword arguments +- following. The return value is the same as +- :meth:`~BaseEventLoop.create_server()`. ++ common are positional *host* and *port*, with various optional keyword ++ arguments following. + + Additional optional keyword arguments are *loop* (to set the event loop + instance to use) and *limit* (to set the buffer limit passed to the + :class:`StreamReader`). + +- The return value is the same as :meth:`~BaseEventLoop.create_server()`, i.e. +- a :class:`AbstractServer` object which can be used to stop the service. +- + This function is a :ref:`coroutine `. + + .. function:: open_unix_connection(path=None, \*, loop=None, limit=None, **kwds) +diff -r c0e311e010fc Doc/library/asyncio-subprocess.rst +--- a/Doc/library/asyncio-subprocess.rst ++++ b/Doc/library/asyncio-subprocess.rst +@@ -1,20 +1,27 @@ + .. currentmodule:: asyncio + ++.. _asyncio-subprocess: ++ + Subprocess + ========== + +-Operating system support +------------------------- ++Windows event loop ++------------------ + +-On Windows, the default event loop uses :class:`selectors.SelectSelector` +-which only supports sockets. The :class:`ProactorEventLoop` should be used to +-support subprocesses. However, the latter does not support SSL. ++On Windows, the default event loop is :class:`SelectorEventLoop` which does not ++support subprocesses. :class:`ProactorEventLoop` should be used instead. ++Example to use it on Windows:: + +-On Mac OS X older than 10.9 (Mavericks), :class:`selectors.KqueueSelector` +-does not support character devices like PTY, whereas it is used by the +-default event loop. The :class:`SelectorEventLoop` can be used with +-:class:`SelectSelector` or :class:`PollSelector` to handle character +-devices on Mac OS X 10.6 (Snow Leopard) and later. ++ import asyncio, os ++ ++ if os.name == 'nt': ++ loop = asyncio.ProactorEventLoop() ++ asyncio.set_event_loop(loop) ++ ++.. seealso:: ++ ++ :ref:`Available event loops ` and :ref:`Platform ++ support `. + + + Create a subprocess: high-level API using Process +@@ -22,8 +29,8 @@ + + .. function:: create_subprocess_shell(cmd, stdin=None, stdout=None, stderr=None, loop=None, limit=None, \*\*kwds) + +- Run the shell command *cmd* given as a string. Return a :class:`~asyncio.subprocess.Process` +- instance. ++ Run the shell command *cmd*. See :meth:`BaseEventLoop.subprocess_shell` for ++ parameters. Return a :class:`~asyncio.subprocess.Process` instance. + + The optional *limit* parameter sets the buffer limit passed to the + :class:`StreamReader`. +@@ -32,7 +39,8 @@ + + .. function:: create_subprocess_exec(\*args, stdin=None, stdout=None, stderr=None, loop=None, limit=None, \*\*kwds) + +- Create a subprocess. Return a :class:`~asyncio.subprocess.Process` instance. ++ Create a subprocess. See :meth:`BaseEventLoop.subprocess_exec` for ++ parameters. Return a :class:`~asyncio.subprocess.Process` instance. + + The optional *limit* parameter sets the buffer limit passed to the + :class:`StreamReader`. +@@ -50,7 +58,9 @@ + + .. 
method:: BaseEventLoop.subprocess_exec(protocol_factory, \*args, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, \*\*kwargs) + +- Create a subprocess from one or more string arguments, where the first string ++ Create a subprocess from one or more string arguments (character strings or ++ bytes strings encoded to the :ref:`filesystem encoding ++ `), where the first string + specifies the program to execute, and the remaining strings specify the + program's arguments. (Thus, together the string arguments form the + ``sys.argv`` value of the program, assuming it is a Python script.) This is +@@ -94,8 +104,9 @@ + + .. method:: BaseEventLoop.subprocess_shell(protocol_factory, cmd, \*, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, \*\*kwargs) + +- Create a subprocess from *cmd*, which is a string using the platform's +- "shell" syntax. This is similar to the standard library ++ Create a subprocess from *cmd*, which is a character string or a bytes ++ string encoded to the :ref:`filesystem encoding `, ++ using the platform's "shell" syntax. This is similar to the standard library + :class:`subprocess.Popen` class called with ``shell=True``. + + See :meth:`~BaseEventLoop.subprocess_exec` for more details about +@@ -180,6 +191,10 @@ + process, or ``None``, if no data should be sent to the child. The type + of *input* must be bytes. + ++ If a :exc:`BrokenPipeError` or :exc:`ConnectionResetError` exception is ++ raised when writing *input* into stdin, the exception is ignored. It ++ occurs when the process exits before all data are written into stdin. ++ + :meth:`communicate` returns a tuple ``(stdoutdata, stderrdata)``. + + Note that if you want to send data to the process's stdin, you need to +@@ -194,6 +209,10 @@ + + This method is a :ref:`coroutine `. + ++ .. versionchanged:: 3.4.2 ++ The method now ignores :exc:`BrokenPipeError` and ++ :exc:`ConnectionResetError`. ++ + .. method:: kill() + + Kills the child. On Posix OSs the function sends :py:data:`SIGKILL` to +diff -r c0e311e010fc Doc/library/asyncio-task.rst +--- a/Doc/library/asyncio-task.rst ++++ b/Doc/library/asyncio-task.rst +@@ -51,8 +51,8 @@ + generator object, which doesn't do anything until you iterate over it. + In the case of a coroutine object, there are two basic ways to start + it running: call ``yield from coroutine`` from another coroutine +-(assuming the other coroutine is already running!), or convert it to a +-:class:`Task`. ++(assuming the other coroutine is already running!), or schedule its execution ++using the :meth:`BaseEventLoop.create_task` method. + + Coroutines (and tasks) can only run when the event loop is running. + +@@ -89,7 +89,10 @@ + yield from asyncio.sleep(2) + + loop = asyncio.get_event_loop() +- loop.run_until_complete(greet_every_two_seconds()) ++ try: ++ loop.run_until_complete(greet_every_two_seconds()) ++ finally: ++ loop.close() + + .. seealso:: + +@@ -142,6 +145,18 @@ + The operation is not allowed in this state. + + ++TimeoutError ++------------ ++ ++.. exception:: TimeoutError ++ ++ The operation exceeded the given deadline. ++ ++.. note:: ++ ++ This exception is different from the builtin :exc:`TimeoutError` exception! 
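To make the difference concrete, here is a minimal sketch (the coroutine name is illustrative) in which :func:`wait_for`, documented further down, raises :exc:`asyncio.TimeoutError` rather than the builtin exception::

    import asyncio

    @asyncio.coroutine
    def slow_operation():
        yield from asyncio.sleep(10)

    loop = asyncio.get_event_loop()
    try:
        loop.run_until_complete(asyncio.wait_for(slow_operation(), timeout=0.1))
    except asyncio.TimeoutError:
        print("caught asyncio.TimeoutError, not the builtin TimeoutError")
    finally:
        loop.close()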
++ ++ + Future + ------ + +@@ -241,12 +256,12 @@ + + loop = asyncio.get_event_loop() + future = asyncio.Future() +- asyncio.Task(slow_operation(future)) ++ loop.create_task(slow_operation(future)) + loop.run_until_complete(future) + print(future.result()) + loop.close() + +-The coroutine function is responsible of the computation (which takes 1 second) ++The coroutine function is responsible for the computation (which takes 1 second) + and it stores the result into the future. The + :meth:`~BaseEventLoop.run_until_complete` method waits for the completion of + the future. +@@ -277,7 +292,7 @@ + + loop = asyncio.get_event_loop() + future = asyncio.Future() +- asyncio.Task(slow_operation(future)) ++ loop.create_task(slow_operation(future)) + future.add_done_callback(got_result) + try: + loop.run_forever() +@@ -299,7 +314,33 @@ + + .. class:: Task(coro, \*, loop=None) + +- A coroutine object wrapped in a :class:`Future`. Subclass of :class:`Future`. ++ Schedule the execution of a :ref:`coroutine `: wrap it in a ++ future. A task is a subclass of :class:`Future`. ++ ++ A task is responsible to execute a coroutine object in an event loop. If ++ the wrapped coroutine yields from a future, the task suspends the execution ++ of the wrapped coroutine and waits for the completition of the future. When ++ the future is done, the execution of the wrapped coroutine restarts with the ++ result or the exception of the future. ++ ++ Event loops use cooperative scheduling: an event loop only runs one task at ++ the same time. Other tasks may run in parallel if other event loops are ++ running in different threads. While a task waits for the completion of a ++ future, the event loop executes a new task. ++ ++ The cancellation of a task is different than cancelling a future. Calling ++ :meth:`cancel` will throw a :exc:`~concurrent.futures.CancelledError` to the ++ wrapped coroutine. :meth:`~Future.cancelled` only returns ``True`` if the ++ wrapped coroutine did not catch the ++ :exc:`~concurrent.futures.CancelledError` exception, or raised a ++ :exc:`~concurrent.futures.CancelledError` exception. ++ ++ If a pending task is destroyed, the execution of its wrapped :ref:`coroutine ++ ` did not complete. It is probably a bug and a warning is ++ logged: see :ref:`Pending task destroyed `. ++ ++ Don't create directly :class:`Task` instances: use the ++ :meth:`BaseEventLoop.create_task` method. + + .. classmethod:: all_tasks(loop=None) + +@@ -315,7 +356,27 @@ + + ``None`` is returned when called not in the context of a :class:`Task`. + +- .. method:: get_stack(self, \*, limit=None) ++ .. method:: cancel() ++ ++ Request this task to cancel itself. ++ ++ This arranges for a :exc:`~concurrent.futures.CancelledError` to be ++ thrown into the wrapped coroutine on the next cycle through the event ++ loop. The coroutine then has a chance to clean up or even deny the ++ request using try/except/finally. ++ ++ Contrary to :meth:`Future.cancel`, this does not guarantee that the task ++ will be cancelled: the exception might be caught and acted upon, delaying ++ cancellation of the task or preventing it completely. The task may also ++ return a value or raise a different exception. ++ ++ Immediately after this method is called, :meth:`~Future.cancelled` will ++ not return ``True`` (unless the task was already cancelled). A task will ++ be marked as cancelled when the wrapped coroutine terminates with a ++ :exc:`~concurrent.futures.CancelledError` exception (even if ++ :meth:`cancel` was not called). ++ ++ .. 
method:: get_stack(\*, limit=None) + + Return the list of stack frames for this task's coroutine. + +@@ -361,12 +422,11 @@ + f *= i + print("Task %s: factorial(%s) = %s" % (name, number, f)) + ++ loop = asyncio.get_event_loop() + tasks = [ +- asyncio.Task(factorial("A", 2)), +- asyncio.Task(factorial("B", 3)), +- asyncio.Task(factorial("C", 4))] +- +- loop = asyncio.get_event_loop() ++ loop.create_task(factorial("A", 2)), ++ loop.create_task(factorial("B", 3)), ++ loop.create_task(factorial("C", 4))] + loop.run_until_complete(asyncio.wait(tasks)) + loop.close() + +@@ -400,7 +460,8 @@ + Return an iterator whose values, when waited for, are :class:`Future` + instances. + +- Raises :exc:`TimeoutError` if the timeout occurs before all Futures are done. ++ Raises :exc:`asyncio.TimeoutError` if the timeout occurs before all Futures ++ are done. + + Example:: + +@@ -414,7 +475,8 @@ + + .. function:: async(coro_or_future, \*, loop=None) + +- Wrap a :ref:`coroutine object ` in a future. ++ Wrap a :ref:`coroutine object ` in a future using the ++ :meth:`BaseEventLoop.create_task` method. + + If the argument is a :class:`Future`, it is returned directly. + +@@ -488,6 +550,8 @@ + to complete. Coroutines will be wrapped in Tasks. Returns two sets of + :class:`Future`: (done, pending). + ++ The sequence *futures* must not be empty. ++ + *timeout* can be used to control the maximum number of seconds to wait before + returning. *timeout* can be an int or float. If *timeout* is not specified + or ``None``, there is no limit to the wait time. +@@ -521,25 +585,24 @@ + + .. note:: + +- This does not raise :exc:`TimeoutError`! Futures that aren't done when +- the timeout occurs are returned in the second set. ++ This does not raise :exc:`asyncio.TimeoutError`! Futures that aren't done ++ when the timeout occurs are returned in the second set. + + + .. function:: wait_for(fut, timeout, \*, loop=None) + + Wait for the single :class:`Future` or :ref:`coroutine object ` +- to complete, with timeout. If *timeout* is ``None``, block until the future ++ to complete with timeout. If *timeout* is ``None``, block until the future + completes. + +- Coroutine will be wrapped in :class:`Task`. ++ Coroutine objects are wrapped in a future using the ++ :meth:`BaseEventLoop.create_task` method. + + Returns result of the Future or coroutine. When a timeout occurs, it +- cancels the task and raises :exc:`TimeoutError`. To avoid the task ++ cancels the task and raises :exc:`asyncio.TimeoutError`. To avoid the task + cancellation, wrap it in :func:`shield`. + +- This function is a :ref:`coroutine `. ++ This function is a :ref:`coroutine `, usage:: + +- Usage:: ++ result = yield from asyncio.wait_for(fut, 60.0) + +- result = yield from asyncio.wait_for(fut, 60.0) +- +diff -r c0e311e010fc Doc/library/asyncio.rst +--- a/Doc/library/asyncio.rst ++++ b/Doc/library/asyncio.rst +@@ -39,12 +39,13 @@ + you absolutely, positively have to use a library that makes blocking + I/O calls. + +-Table of content: ++Table of contents: + + .. toctree:: + :maxdepth: 3 + + asyncio-eventloop.rst ++ asyncio-eventloops.rst + asyncio-task.rst + asyncio-protocol.rst + asyncio-stream.rst +@@ -54,6 +55,6 @@ + + .. seealso:: + +- The :mod:`asyncio` module was designed in the :PEP:`3156`. For a ++ The :mod:`asyncio` module was designed in :PEP:`3156`. For a + motivational primer on transports and protocols, see :PEP:`3153`. 
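Pulling together :meth:`BaseEventLoop.create_task` and :func:`wait` as revised above, a minimal sketch (assuming an asyncio release that has ``create_task``, i.e. 3.4.2 per the hunks above; names and delays are illustrative) might look like::

    import asyncio

    @asyncio.coroutine
    def work(delay):
        yield from asyncio.sleep(delay)
        return delay

    loop = asyncio.get_event_loop()
    tasks = [loop.create_task(work(d)) for d in (0.1, 0.2, 5.0)]
    done, pending = loop.run_until_complete(asyncio.wait(tasks, timeout=1.0))
    print("done:", sorted(task.result() for task in done))
    for task in pending:
        # Cancel whatever did not finish within the timeout; wait() itself
        # does not raise asyncio.TimeoutError.
        task.cancel()
    loop.run_until_complete(asyncio.wait(pending))
    loop.close()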
+ +diff -r c0e311e010fc Doc/library/asyncore.rst +--- a/Doc/library/asyncore.rst ++++ b/Doc/library/asyncore.rst +@@ -216,6 +216,10 @@ + empty bytes object implies that the channel has been closed from the + other end. + ++ Note that :meth:`recv` may raise :exc:`BlockingIOError` , even though ++ :func:`select.select` or :func:`select.poll` has reported the socket ++ ready for reading. ++ + + .. method:: listen(backlog) + +diff -r c0e311e010fc Doc/library/collections.abc.rst +--- a/Doc/library/collections.abc.rst ++++ b/Doc/library/collections.abc.rst +@@ -173,7 +173,7 @@ + + (2) + To override the comparisons (presumably for speed, as the +- semantics are fixed), redefine :meth:`__le__` and ++ semantics are fixed), redefine :meth:`__le__` and :meth:`__ge__`, + then the other operations will automatically follow suit. + + (3) +diff -r c0e311e010fc Doc/library/contextlib.rst +--- a/Doc/library/contextlib.rst ++++ b/Doc/library/contextlib.rst +@@ -371,7 +371,7 @@ + with ExitStack() as stack: + for resource in resources: + stack.enter_context(resource) +- if need_special resource: ++ if need_special_resource(): + special = acquire_special_resource() + stack.callback(release_special_resource, special) + # Perform operations that use the acquired resources +diff -r c0e311e010fc Doc/library/datetime.rst +--- a/Doc/library/datetime.rst ++++ b/Doc/library/datetime.rst +@@ -1687,11 +1687,11 @@ + .. seealso:: + + `pytz `_ +- The standard library has no :class:`tzinfo` instances except for UTC, but +- there exists a third-party library which brings the *IANA timezone +- database* (also known as the Olson database) to Python: *pytz*. +- +- *pytz* contains up-to-date information and its usage is recommended. ++ The standard library has :class:`timezone` class for handling arbitrary ++ fixed offsets from UTC and :attr:`timezone.utc` as UTC timezone instance. ++ ++ *pytz* library brings the *IANA timezone database* (also known as the ++ Olson database) to Python and its usage is recommended. + + `IANA timezone database `_ + The Time Zone Database (often called tz or zoneinfo) contains code and +@@ -1728,6 +1728,8 @@ + *offset*, HH and MM are two digits of ``offset.hours`` and + ``offset.minutes`` respectively. + ++ .. versionadded:: 3.2 ++ + .. method:: timezone.utcoffset(dt) + + Return the fixed value specified when the :class:`timezone` instance is +diff -r c0e311e010fc Doc/library/dbm.rst +--- a/Doc/library/dbm.rst ++++ b/Doc/library/dbm.rst +@@ -222,6 +222,9 @@ + When the database has been opened in fast mode, this method forces any + unwritten data to be written to the disk. + ++ .. method:: gdbm.close() ++ ++ Close the ``gdbm`` database. + + :mod:`dbm.ndbm` --- Interface based on ndbm + ------------------------------------------- +@@ -253,7 +256,7 @@ + + .. function:: open(filename[, flag[, mode]]) + +- Open a dbm database and return a ``dbm`` object. The *filename* argument is the ++ Open a dbm database and return a ``ndbm`` object. The *filename* argument is the + name of the database file (without the :file:`.dir` or :file:`.pag` extensions). + + The optional *flag* argument must be one of these values: +@@ -278,6 +281,12 @@ + database has to be created. It defaults to octal ``0o666`` (and will be + modified by the prevailing umask). + ++ In addition to the dictionary-like methods, ``ndbm`` objects ++ provide the following method: ++ ++ .. method:: ndbm.close() ++ ++ Close the ``ndbm`` database. 
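A minimal sketch of the open/close cycle described above (the file name is arbitrary; ``dbm.open`` picks whichever submodule is available on the system)::

    import dbm

    db = dbm.open('/tmp/example_db', 'c')    # create the database if needed
    try:
        db[b'www.python.org'] = b'Python Website'
        print(db[b'www.python.org'])
    finally:
        db.close()                           # the close() method shown above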
+ + + :mod:`dbm.dumb` --- Portable DBM implementation +@@ -325,9 +334,14 @@ + + In addition to the methods provided by the + :class:`collections.abc.MutableMapping` class, :class:`dumbdbm` objects +- provide the following method: ++ provide the following methods: + + .. method:: dumbdbm.sync() + + Synchronize the on-disk directory and data files. This method is called + by the :meth:`Shelve.sync` method. ++ ++ .. method:: dumbdbm.close() ++ ++ Close the ``dumbdbm`` database. ++ +diff -r c0e311e010fc Doc/library/exceptions.rst +--- a/Doc/library/exceptions.rst ++++ b/Doc/library/exceptions.rst +@@ -274,9 +274,10 @@ + + Raised when the result of an arithmetic operation is too large to be + represented. This cannot occur for integers (which would rather raise +- :exc:`MemoryError` than give up). Because of the lack of standardization of +- floating point exception handling in C, most floating point operations also +- aren't checked. ++ :exc:`MemoryError` than give up). However, for historical reasons, ++ OverflowError is sometimes raised for integers that are outside a required ++ range. Because of the lack of standardization of floating point exception ++ handling in C, most floating point operations are not checked. + + + .. exception:: ReferenceError +@@ -457,10 +458,6 @@ + + .. exception:: IOError + +-.. exception:: VMSError +- +- Only available on VMS. +- + .. exception:: WindowsError + + Only available on Windows. +diff -r c0e311e010fc Doc/library/functions.rst +--- a/Doc/library/functions.rst ++++ b/Doc/library/functions.rst +@@ -410,6 +410,7 @@ + See :func:`ast.literal_eval` for a function that can safely evaluate strings + with expressions containing only literals. + ++.. index:: builtin: exec + + .. function:: exec(object[, globals[, locals]]) + +@@ -742,7 +743,8 @@ + .. function:: len(s) + + Return the length (the number of items) of an object. The argument may be a +- sequence (string, tuple or list) or a mapping (dictionary). ++ sequence (such as a string, bytes, tuple, list, or range) or a collection ++ (such as a dictionary, set, or frozen set). + + + .. _func-list: +diff -r c0e311e010fc Doc/library/hashlib.rst +--- a/Doc/library/hashlib.rst ++++ b/Doc/library/hashlib.rst +@@ -180,9 +180,9 @@ + ----------------------- + + Key derivation and key stretching algorithms are designed for secure password +-hashing. Naive algorithms such as ``sha1(password)`` are not resistant +-against brute-force attacks. A good password hashing function must be tunable, +-slow and include a salt. ++hashing. Naive algorithms such as ``sha1(password)`` are not resistant against ++brute-force attacks. A good password hashing function must be tunable, slow, and ++include a `salt `_. + + + .. function:: pbkdf2_hmac(name, password, salt, rounds, dklen=None) +@@ -197,8 +197,7 @@ + a proper source, e.g. :func:`os.urandom`. + + The number of *rounds* should be chosen based on the hash algorithm and +- computing power. As of 2013 a value of at least 100,000 rounds of SHA-256 +- have been suggested. ++ computing power. As of 2013, at least 100,000 rounds of SHA-256 is suggested. + + *dklen* is the length of the derived key. If *dklen* is ``None`` then the + digest size of the hash algorithm *name* is used, e.g. 64 for SHA-512. +@@ -210,9 +209,11 @@ + + .. versionadded:: 3.4 + +- .. note:: A fast implementation of *pbkdf2_hmac* is available with OpenSSL. +- The Python implementation uses an inline version of :mod:`hmac`. It is +- about three times slower and doesn't release the GIL. ++ .. 
note:: ++ ++ A fast implementation of *pbkdf2_hmac* is available with OpenSSL. The ++ Python implementation uses an inline version of :mod:`hmac`. It is about ++ three times slower and doesn't release the GIL. + + + .. seealso:: +diff -r c0e311e010fc Doc/library/hmac.rst +--- a/Doc/library/hmac.rst ++++ b/Doc/library/hmac.rst +@@ -25,7 +25,7 @@ + .. versionchanged:: 3.4 + Parameter *key* can be a bytes or bytearray object. + Parameter *msg* can be of any type supported by :mod:`hashlib`. +- Paramter *digestmod* can be the name of a hash algorithm. ++ Parameter *digestmod* can be the name of a hash algorithm. + + .. deprecated:: 3.4 + MD5 as implicit default digest for *digestmod* is deprecated. +diff -r c0e311e010fc Doc/library/io.rst +--- a/Doc/library/io.rst ++++ b/Doc/library/io.rst +@@ -289,7 +289,7 @@ + most *size* bytes will be read. + + The line terminator is always ``b'\n'`` for binary files; for text files, +- the *newlines* argument to :func:`open` can be used to select the line ++ the *newline* argument to :func:`open` can be used to select the line + terminator(s) recognized. + + .. method:: readlines(hint=-1) +@@ -353,6 +353,12 @@ + is usual for each of the lines provided to have a line separator at the + end. + ++ .. method:: __del__() ++ ++ Prepare for object destruction. :class:`IOBase` provides a default ++ implementation of this method that calls the instance's ++ :meth:`~IOBase.close` method. ++ + + .. class:: RawIOBase + +diff -r c0e311e010fc Doc/library/logging.handlers.rst +--- a/Doc/library/logging.handlers.rst ++++ b/Doc/library/logging.handlers.rst +@@ -850,10 +850,27 @@ + credentials, you should also specify secure=True so that your userid and + password are not passed in cleartext across the wire. + ++ .. method:: mapLogRecord(record) ++ ++ Provides a dictionary, based on ``record``, which is to be URL-encoded ++ and sent to the web server. The default implementation just returns ++ ``record.__dict__``. This method can be overridden if e.g. only a ++ subset of :class:`~logging.LogRecord` is to be sent to the web server, or ++ if more specific customization of what's sent to the server is required. + + .. method:: emit(record) + +- Sends the record to the Web server as a percent-encoded dictionary. ++ Sends the record to the Web server as an URL-encoded dictionary. The ++ :meth:`mapLogRecord` method is used to convert the record to the ++ dictionary to be sent. ++ ++ .. note:: Since preparing a record for sending it to a Web server is not ++ the same as a generic formatting operation, using ++ :meth:`~logging.Handler.setFormatter` to specify a ++ :class:`~logging.Formatter` for a :class:`HTTPHandler` has no effect. ++ Instead of calling :meth:`~logging.Handler.format`, this handler calls ++ :meth:`mapLogRecord` and then :func:`urllib.parse.urlencode` to encode the ++ dictionary in a form suitable for sending to a Web server. + + + .. _queue-handler: +diff -r c0e311e010fc Doc/library/logging.rst +--- a/Doc/library/logging.rst ++++ b/Doc/library/logging.rst +@@ -1049,6 +1049,11 @@ + of the defined levels is passed in, the corresponding string representation is + returned. Otherwise, the string 'Level %s' % lvl is returned. + ++ .. versionchanged:: 3.4 ++ In Python versions earlier than 3.4, this function could also be passed a ++ text level, and would return the corresponding numeric value of the level. ++ This undocumented behaviour was a mistake, and has been removed in Python ++ 3.4. + + .. 
function:: makeLogRecord(attrdict) + +diff -r c0e311e010fc Doc/library/multiprocessing.rst +--- a/Doc/library/multiprocessing.rst ++++ b/Doc/library/multiprocessing.rst +@@ -262,8 +262,10 @@ + + def f(l, i): + l.acquire() +- print('hello world', i) +- l.release() ++ try: ++ print('hello world', i) ++ finally: ++ l.release() + + if __name__ == '__main__': + lock = Lock() +@@ -396,7 +398,7 @@ + print(res.get(timeout=1)) # prints "100" + + # make worker sleep for 10 secs +- res = pool.apply_async(sleep, 10) ++ res = pool.apply_async(sleep, [10]) + print(res.get(timeout=1)) # raises multiprocessing.TimeoutError + + # exiting the 'with'-block has stopped the pool +diff -r c0e311e010fc Doc/library/os.path.rst +--- a/Doc/library/os.path.rst ++++ b/Doc/library/os.path.rst +@@ -188,7 +188,7 @@ + .. function:: islink(path) + + Return ``True`` if *path* refers to a directory entry that is a symbolic link. +- Always ``False`` if symbolic links are not supported. ++ Always ``False`` if symbolic links are not supported by the python runtime. + + + .. function:: ismount(path) +diff -r c0e311e010fc Doc/library/os.rst +--- a/Doc/library/os.rst ++++ b/Doc/library/os.rst +@@ -53,7 +53,7 @@ + .. data:: name + + The name of the operating system dependent module imported. The following +- names have currently been registered: ``'posix'``, ``'nt'``, ``'mac'``, ++ names have currently been registered: ``'posix'``, ``'nt'``, + ``'ce'``, ``'java'``. + + .. seealso:: +@@ -65,6 +65,7 @@ + + + .. _os-filenames: ++.. _filesystem-encoding: + + File Names, Command Line Arguments, and Environment Variables + ------------------------------------------------------------- +@@ -379,7 +380,7 @@ + + .. index:: single: user; id + +- Return the current process's user id. ++ Return the current process's real user id. + + Availability: Unix. + +@@ -764,8 +765,14 @@ + + .. function:: fstat(fd) + +- Return status for file descriptor *fd*, like :func:`~os.stat`. As of Python +- 3.3, this is equivalent to ``os.stat(fd)``. ++ Get the status of the file descriptor *fd*. Return a :class:`stat_result` ++ object. ++ ++ As of Python 3.3, this is equivalent to ``os.stat(fd)``. ++ ++ .. seealso:: ++ ++ The :func:`stat` function. + + Availability: Unix, Windows. + +@@ -1087,8 +1094,16 @@ + All platforms support sockets as *out* file descriptor, and some platforms + allow other types (e.g. regular file, pipe) as well. + ++ Cross-platform applications should not use *headers*, *trailers* and *flags* ++ arguments. ++ + Availability: Unix. + ++ .. note:: ++ ++ For a higher-level wrapper of :func:`sendfile`, see ++ :mod:`socket.socket.sendfile`. ++ + .. versionadded:: 3.3 + + +@@ -1569,17 +1584,25 @@ + Added support for specifying an open file descriptor for *path*. + + +-.. function:: lstat(path, *, dir_fd=None) ++.. function:: lstat(path, \*, dir_fd=None) + + Perform the equivalent of an :c:func:`lstat` system call on the given path. +- Similar to :func:`~os.stat`, but does not follow symbolic links. On +- platforms that do not support symbolic links, this is an alias for +- :func:`~os.stat`. As of Python 3.3, this is equivalent to ``os.stat(path, +- dir_fd=dir_fd, follow_symlinks=False)``. ++ Similar to :func:`~os.stat`, but does not follow symbolic links. Return a ++ :class:`stat_result` object. ++ ++ On platforms that do not support symbolic links, this is an alias for ++ :func:`~os.stat`. ++ ++ As of Python 3.3, this is equivalent to ``os.stat(path, dir_fd=dir_fd, ++ follow_symlinks=False)``. 
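A minimal sketch of the difference between :func:`stat` and :func:`lstat` just described (the paths are illustrative and the snippet assumes a platform with symbolic link support)::

    import os, stat

    open('/tmp/example_target', 'w').close()                # a real file
    os.symlink('/tmp/example_target', '/tmp/example_link')  # and a link to it

    print(stat.S_ISLNK(os.lstat('/tmp/example_link').st_mode))  # True: the link itself
    print(stat.S_ISLNK(os.stat('/tmp/example_link').st_mode))   # False: stat() follows the link

    os.unlink('/tmp/example_link')
    os.unlink('/tmp/example_target')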
+ + This function can also support :ref:`paths relative to directory descriptors + `. + ++ .. seealso:: ++ ++ The :func:`stat` function. ++ + .. versionchanged:: 3.2 + Added support for Windows 6.0 (Vista) symbolic links. + +@@ -1846,55 +1869,116 @@ + The *dir_fd* parameter. + + +-.. function:: stat(path, *, dir_fd=None, follow_symlinks=True) +- +- Perform the equivalent of a :c:func:`stat` system call on the given path. +- *path* may be specified as either a string or as an open file descriptor. +- (This function normally follows symlinks; to stat a symlink add the argument +- ``follow_symlinks=False``, or use :func:`lstat`.) +- +- The return value is an object whose attributes correspond roughly +- to the members of the :c:type:`stat` structure, namely: +- +- * :attr:`st_mode` - protection bits, +- * :attr:`st_ino` - inode number, +- * :attr:`st_dev` - device, +- * :attr:`st_nlink` - number of hard links, +- * :attr:`st_uid` - user id of owner, +- * :attr:`st_gid` - group id of owner, +- * :attr:`st_size` - size of file, in bytes, +- * :attr:`st_atime` - time of most recent access expressed in seconds, +- * :attr:`st_mtime` - time of most recent content modification +- expressed in seconds, +- * :attr:`st_ctime` - platform dependent; time of most recent metadata +- change on Unix, or the time of creation on Windows, expressed in seconds +- * :attr:`st_atime_ns` - time of most recent access +- expressed in nanoseconds as an integer, +- * :attr:`st_mtime_ns` - time of most recent content modification +- expressed in nanoseconds as an integer, +- * :attr:`st_ctime_ns` - platform dependent; time of most recent metadata +- change on Unix, or the time of creation on Windows, +- expressed in nanoseconds as an integer +- +- On some Unix systems (such as Linux), the following attributes may also be +- available: +- +- * :attr:`st_blocks` - number of 512-byte blocks allocated for file +- * :attr:`st_blksize` - filesystem blocksize for efficient file system I/O +- * :attr:`st_rdev` - type of device if an inode device +- * :attr:`st_flags` - user defined flags for file +- +- On other Unix systems (such as FreeBSD), the following attributes may be +- available (but may be only filled out if root tries to use them): +- +- * :attr:`st_gen` - file generation number +- * :attr:`st_birthtime` - time of file creation +- +- On Mac OS systems, the following attributes may also be available: +- +- * :attr:`st_rsize` +- * :attr:`st_creator` +- * :attr:`st_type` ++.. function:: stat(path, \*, dir_fd=None, follow_symlinks=True) ++ ++ Get the status of a file or a file descriptor. Perform the equivalent of a ++ :c:func:`stat` system call on the given path. *path* may be specified as ++ either a string or as an open file descriptor. Return a :class:`stat_result` ++ object. ++ ++ This function normally follows symlinks; to stat a symlink add the argument ++ ``follow_symlinks=False``, or use :func:`lstat`. ++ ++ This function can support :ref:`specifying a file descriptor ` and ++ :ref:`not following symlinks `. ++ ++ .. index:: module: stat ++ ++ Example:: ++ ++ >>> import os ++ >>> statinfo = os.stat('somefile.txt') ++ >>> statinfo ++ os.stat_result(st_mode=33188, st_ino=7876932, st_dev=234881026, ++ st_nlink=1, st_uid=501, st_gid=501, st_size=264, st_atime=1297230295, ++ st_mtime=1297230027, st_ctime=1297230027) ++ >>> statinfo.st_size ++ 264 ++ ++ Availability: Unix, Windows. ++ ++ .. seealso:: ++ ++ :func:`fstat` and :func:`lstat` functions. ++ ++ .. 
versionadded:: 3.3 ++ Added the *dir_fd* and *follow_symlinks* arguments, specifying a file ++ descriptor instead of a path. ++ ++ ++.. class:: stat_result ++ ++ Object whose attributes correspond roughly to the members of the ++ :c:type:`stat` structure. It is used for the result of :func:`os.stat`, ++ :func:`os.fstat` and :func:`os.lstat`. ++ ++ Attributes: ++ ++ .. attribute:: st_mode ++ ++ File mode: file type and file mode bits (permissions). ++ ++ .. attribute:: st_ino ++ ++ Inode number. ++ ++ .. attribute:: st_dev ++ ++ Identifier of the device on which this file resides. ++ ++ .. attribute:: st_nlink ++ ++ Number of hard links. ++ ++ .. attribute:: st_uid ++ ++ User identifier of the file owner. ++ ++ .. attribute:: st_gid ++ ++ Group identifier of the file owner. ++ ++ .. attribute:: st_size ++ ++ Size of the file in bytes, if it is a regular file or a symbolic link. ++ The size of a symbolic link is the length of the pathname it contains, ++ without a terminating null byte. ++ ++ Timestamps: ++ ++ .. attribute:: st_atime ++ ++ Time of most recent access expressed in seconds. ++ ++ .. attribute:: st_mtime ++ ++ Time of most recent content modification expressed in seconds. ++ ++ .. attribute:: st_ctime ++ ++ Platform dependent: ++ ++ * the time of most recent metadata change on Unix, ++ * the time of creation on Windows, expressed in seconds. ++ ++ .. attribute:: st_atime_ns ++ ++ Time of most recent access expressed in nanoseconds as an integer. ++ ++ .. attribute:: st_mtime_ns ++ ++ Time of most recent content modification expressed in nanoseconds as an ++ integer. ++ ++ .. attribute:: st_ctime_ns ++ ++ Platform dependent: ++ ++ * the time of most recent metadata change on Unix, ++ * the time of creation on Windows, expressed in nanoseconds as an ++ integer. ++ ++ See also the :func:`stat_float_times` function. + + .. note:: + +@@ -1904,6 +1988,7 @@ + or FAT32 file systems, :attr:`st_mtime` has 2-second resolution, and + :attr:`st_atime` has only 1-day resolution. See your operating system + documentation for details. ++ + Similarly, although :attr:`st_atime_ns`, :attr:`st_mtime_ns`, + and :attr:`st_ctime_ns` are always expressed in nanoseconds, many + systems do not provide nanosecond precision. On systems that do +@@ -1913,41 +1998,68 @@ + If you need the exact timestamps you should always use + :attr:`st_atime_ns`, :attr:`st_mtime_ns`, and :attr:`st_ctime_ns`. + +- For backward compatibility, the return value of :func:`~os.stat` is also ++ On some Unix systems (such as Linux), the following attributes may also be ++ available: ++ ++ .. attribute:: st_blocks ++ ++ Number of 512-byte blocks allocated for file. ++ This may be smaller than :attr:`st_size`/512 when the file has holes. ++ ++ .. attribute:: st_blksize ++ ++ "Preferred" blocksize for efficient file system I/O. Writing to a file in ++ smaller chunks may cause an inefficient read-modify-rewrite. ++ ++ .. attribute:: st_rdev ++ ++ Type of device if an inode device. ++ ++ .. attribute:: st_flags ++ ++ User defined flags for file. ++ ++ On other Unix systems (such as FreeBSD), the following attributes may be ++ available (but may be only filled out if root tries to use them): ++ ++ .. attribute:: st_gen ++ ++ File generation number. ++ ++ .. attribute:: st_birthtime ++ ++ Time of file creation. ++ ++ On Mac OS systems, the following attributes may also be available: ++ ++ .. attribute:: st_rsize ++ ++ Real size of the file. ++ ++ .. attribute:: st_creator ++ ++ Creator of the file. ++ ++ .. 
attribute:: st_type ++ ++ File type. ++ ++ The standard module :mod:`stat` defines functions and constants that are ++ useful for extracting information from a :c:type:`stat` structure. (On ++ Windows, some items are filled with dummy values.) ++ ++ For backward compatibility, a :class:`stat_result` instance is also + accessible as a tuple of at least 10 integers giving the most important (and + portable) members of the :c:type:`stat` structure, in the order + :attr:`st_mode`, :attr:`st_ino`, :attr:`st_dev`, :attr:`st_nlink`, + :attr:`st_uid`, :attr:`st_gid`, :attr:`st_size`, :attr:`st_atime`, + :attr:`st_mtime`, :attr:`st_ctime`. More items may be added at the end by +- some implementations. +- +- This function can support :ref:`specifying a file descriptor ` and +- :ref:`not following symlinks `. +- +- .. index:: module: stat +- +- The standard module :mod:`stat` defines functions and constants that are useful +- for extracting information from a :c:type:`stat` structure. (On Windows, some +- items are filled with dummy values.) +- +- Example:: +- +- >>> import os +- >>> statinfo = os.stat('somefile.txt') +- >>> statinfo +- posix.stat_result(st_mode=33188, st_ino=7876932, st_dev=234881026, +- st_nlink=1, st_uid=501, st_gid=501, st_size=264, st_atime=1297230295, +- st_mtime=1297230027, st_ctime=1297230027) +- >>> statinfo.st_size +- 264 +- +- Availability: Unix, Windows. ++ some implementations. For compatibility with older Python versions, ++ accessing :class:`stat_result` as a tuple always returns integers. + + .. versionadded:: 3.3 +- Added the *dir_fd* and *follow_symlinks* arguments, +- specifying a file descriptor instead of a path, +- and the :attr:`st_atime_ns`, :attr:`st_mtime_ns`, +- and :attr:`st_ctime_ns` members. ++ Added the :attr:`st_atime_ns`, :attr:`st_mtime_ns`, and ++ :attr:`st_ctime_ns` members. + + + .. function:: stat_float_times([newvalue]) +@@ -2228,9 +2340,11 @@ + + If optional argument *topdown* is ``True`` or not specified, the triple for a + directory is generated before the triples for any of its subdirectories +- (directories are generated top-down). If *topdown* is ``False``, the triple for a +- directory is generated after the triples for all of its subdirectories +- (directories are generated bottom-up). ++ (directories are generated top-down). If *topdown* is ``False``, the triple ++ for a directory is generated after the triples for all of its subdirectories ++ (directories are generated bottom-up). No matter the value of *topdown*, the ++ list of subdirectories is retrieved before the tuples for the directory and ++ its subdirectories are generated. + + When *topdown* is ``True``, the caller can modify the *dirnames* list in-place + (perhaps using :keyword:`del` or slice assignment), and :func:`walk` will only +@@ -2730,10 +2844,27 @@ + Availability: Unix. + + +-.. function:: popen(...) +- +- Run child processes, returning opened pipes for communications. These functions +- are described in section :ref:`os-newstreams`. ++.. function:: popen(command, mode='r', buffering=-1) ++ ++ Open a pipe to or from *command*. The return value is an open file object ++ connected to the pipe, which can be read or written depending on whether *mode* ++ is ``'r'`` (default) or ``'w'``. The *buffering* argument has the same meaning as ++ the corresponding argument to the built-in :func:`open` function. The ++ returned file object reads or writes text strings rather than bytes. 
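   For illustration only, a minimal sketch of the text-mode pipe described
   above; the command used here is just a placeholder::

      import os

      pipe = os.popen('echo hello')   # text-mode pipe to the command's stdout
      output = pipe.read()            # a str, because the pipe is text mode
      status = pipe.close()           # None if the command exited successfully
      print(output, status)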
++ ++ The ``close`` method returns :const:`None` if the subprocess exited ++ successfully, or the subprocess's return code if there was an ++ error. On POSIX systems, if the return code is positive it ++ represents the return value of the process left-shifted by one ++ byte. If the return code is negative, the process was terminated ++ by the signal given by the negated value of the return code. (For ++ example, the return value might be ``- signal.SIGKILL`` if the ++ subprocess was killed.) On Windows systems, the return value ++ contains the signed integer return code from the child process. ++ ++ This is implemented using :class:`subprocess.Popen`; see that class's ++ documentation for more powerful ways to manage and communicate with ++ subprocesses. + + + .. function:: spawnl(mode, path, ...) +diff -r c0e311e010fc Doc/library/ossaudiodev.rst +--- a/Doc/library/ossaudiodev.rst ++++ b/Doc/library/ossaudiodev.rst +@@ -407,7 +407,7 @@ + (silent) to 100 (full volume). If the control is monophonic, a 2-tuple is still + returned, but both volumes are the same. + +- Raises :exc:`OSSAudioError` if an invalid control was is specified, or ++ Raises :exc:`OSSAudioError` if an invalid control is specified, or + :exc:`OSError` if an unsupported control is specified. + + +diff -r c0e311e010fc Doc/library/quopri.rst +--- a/Doc/library/quopri.rst ++++ b/Doc/library/quopri.rst +@@ -24,9 +24,8 @@ + .. function:: decode(input, output, header=False) + + Decode the contents of the *input* file and write the resulting decoded binary +- data to the *output* file. *input* and *output* must be :term:`file objects +- `. *input* will be read until ``input.readline()`` returns an +- empty string. If the optional argument *header* is present and true, underscore ++ data to the *output* file. *input* and *output* must be :term:`binary file objects ++ `. If the optional argument *header* is present and true, underscore + will be decoded as space. This is used to decode "Q"-encoded headers as + described in :rfc:`1522`: "MIME (Multipurpose Internet Mail Extensions) + Part Two: Message Header Extensions for Non-ASCII Text". +@@ -34,27 +33,28 @@ + + .. function:: encode(input, output, quotetabs, header=False) + +- Encode the contents of the *input* file and write the resulting quoted-printable +- data to the *output* file. *input* and *output* must be :term:`file objects +- `. *input* will be read until ``input.readline()`` returns an +- empty string. *quotetabs* is a flag which controls whether to encode embedded +- spaces and tabs; when true it encodes such embedded whitespace, and when +- false it leaves them unencoded. Note that spaces and tabs appearing at the +- end of lines are always encoded, as per :rfc:`1521`. *header* is a flag +- which controls if spaces are encoded as underscores as per :rfc:`1522`. ++ Encode the contents of the *input* file and write the resulting quoted- ++ printable data to the *output* file. *input* and *output* must be ++ :term:`binary file objects `. *quotetabs*, a flag which controls ++ whether to encode embedded spaces and tabs must be provideda and when true it ++ encodes such embedded whitespace, and when false it leaves them unencoded. ++ Note that spaces and tabs appearing at the end of lines are always encoded, ++ as per :rfc:`1521`. *header* is a flag which controls if spaces are encoded ++ as underscores as per :rfc:`1522`. + + + .. 
function:: decodestring(s, header=False) + +- Like :func:`decode`, except that it accepts a source string and returns the +- corresponding decoded string. ++ Like :func:`decode`, except that it accepts a source :class:`bytes` and ++ returns the corresponding decoded :class:`bytes`. + + + .. function:: encodestring(s, quotetabs=False, header=False) + +- Like :func:`encode`, except that it accepts a source string and returns the +- corresponding encoded string. *quotetabs* and *header* are optional +- (defaulting to ``False``), and are passed straight through to :func:`encode`. ++ Like :func:`encode`, except that it accepts a source :class:`bytes` and ++ returns the corresponding encoded :class:`bytes`. By default, it sends a ++ False value to *quotetabs* parameter of the :func:`encode` function. ++ + + + .. seealso:: +diff -r c0e311e010fc Doc/library/re.rst +--- a/Doc/library/re.rst ++++ b/Doc/library/re.rst +@@ -458,8 +458,8 @@ + .. function:: compile(pattern, flags=0) + + Compile a regular expression pattern into a regular expression object, which +- can be used for matching using its :func:`match` and :func:`search` methods, +- described below. ++ can be used for matching using its :func:`~regex.match` and ++ :func:`~regex.search` methods, described below. + + The expression's behaviour can be modified by specifying a *flags* value. + Values can be any of the following variables, combined using bitwise OR (the +@@ -563,7 +563,7 @@ + + .. function:: search(pattern, string, flags=0) + +- Scan through *string* looking for a location where the regular expression ++ Scan through *string* looking for the first location where the regular expression + *pattern* produces a match, and return a corresponding :ref:`match object + `. Return ``None`` if no position in the string matches the + pattern; note that this is different from finding a zero-length match at some +@@ -1340,9 +1340,9 @@ + ('ASSIGN', r':='), # Assignment operator + ('END', r';'), # Statement terminator + ('ID', r'[A-Za-z]+'), # Identifiers +- ('OP', r'[+*\/\-]'), # Arithmetic operators ++ ('OP', r'[+\-*/]'), # Arithmetic operators + ('NEWLINE', r'\n'), # Line endings +- ('SKIP', r'[ \t]'), # Skip over spaces and tabs ++ ('SKIP', r'[ \t]+'), # Skip over spaces and tabs + ] + tok_regex = '|'.join('(?P<%s>%s)' % pair for pair in token_specification) + get_token = re.compile(tok_regex).match +diff -r c0e311e010fc Doc/library/runpy.rst +--- a/Doc/library/runpy.rst ++++ b/Doc/library/runpy.rst +@@ -28,6 +28,9 @@ + + .. function:: run_module(mod_name, init_globals=None, run_name=None, alter_sys=False) + ++ .. index:: ++ module: __main__ ++ + Execute the code of the specified module and return the resulting module + globals dictionary. The module's code is first located using the standard + import mechanism (refer to :pep:`302` for details) and then executed in a +@@ -87,6 +90,9 @@ + + .. function:: run_path(file_path, init_globals=None, run_name=None) + ++ .. index:: ++ module: __main__ ++ + Execute the code at the named filesystem location and return the resulting + module globals dictionary. As with a script name supplied to the CPython + command line, the supplied path may refer to a Python source file, a +diff -r c0e311e010fc Doc/library/smtplib.rst +--- a/Doc/library/smtplib.rst ++++ b/Doc/library/smtplib.rst +@@ -32,7 +32,8 @@ + than a success code, an :exc:`SMTPConnectError` is raised. 
The optional + *timeout* parameter specifies a timeout in seconds for blocking operations + like the connection attempt (if not specified, the global default timeout +- setting will be used). The optional source_address parameter allows to bind ++ setting will be used). If the timeout expires, :exc:`socket.timeout` is ++ raised. The optional source_address parameter allows to bind + to some specific source address in a machine with multiple network + interfaces, and/or to some specific source TCP port. It takes a 2-tuple + (host, port), for the socket to bind to as its source address before +diff -r c0e311e010fc Doc/library/sqlite3.rst +--- a/Doc/library/sqlite3.rst ++++ b/Doc/library/sqlite3.rst +@@ -646,7 +646,7 @@ + + .. method:: keys + +- This method returns a tuple of column names. Immediately after a query, ++ This method returns a list of column names. Immediately after a query, + it is the first member of each tuple in :attr:`Cursor.description`. + + Let's assume we initialize a table as in the example given above:: +diff -r c0e311e010fc Doc/library/ssl.rst +--- a/Doc/library/ssl.rst ++++ b/Doc/library/ssl.rst +@@ -787,7 +787,7 @@ + Perform the SSL setup handshake. + + .. versionchanged:: 3.4 +- The handshake method also performce :func:`match_hostname` when the ++ The handshake method also performs :func:`match_hostname` when the + :attr:`~SSLContext.check_hostname` attribute of the socket's + :attr:`~SSLSocket.context` is true. + +@@ -1111,7 +1111,7 @@ + returned. Other return values will result in a TLS fatal error with + :const:`ALERT_DESCRIPTION_INTERNAL_ERROR`. + +- If there is a IDNA decoding error on the server name, the TLS connection ++ If there is an IDNA decoding error on the server name, the TLS connection + will terminate with an :const:`ALERT_DESCRIPTION_INTERNAL_ERROR` fatal TLS + alert message to the client. + +@@ -1220,8 +1220,8 @@ + context.load_default_certs() + + s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) +- ssl_sock = context.wrap_socket(s, server_hostname='www.verisign.com'): +- ssl_sock.connect(('www.verisign.com', 443)) ++ ssl_sock = context.wrap_socket(s, server_hostname='www.verisign.com') ++ ssl_sock.connect(('www.verisign.com', 443)) + + .. versionadded:: 3.4 + +diff -r c0e311e010fc Doc/library/stdtypes.rst +--- a/Doc/library/stdtypes.rst ++++ b/Doc/library/stdtypes.rst +@@ -3031,8 +3031,8 @@ + If no positional argument is given, an empty dictionary is created. + If a positional argument is given and it is a mapping object, a dictionary + is created with the same key-value pairs as the mapping object. Otherwise, +- the positional argument must be an :term:`iterator` object. Each item in +- the iterable must itself be an iterator with exactly two objects. The ++ the positional argument must be an :term:`iterable` object. Each item in ++ the iterable must itself be an iterable with exactly two objects. The + first object of each item becomes a key in the new dictionary, and the + second object the corresponding value. If a key occurs more than once, the + last value for that key becomes the corresponding value in the new +diff -r c0e311e010fc Doc/library/tarfile.rst +--- a/Doc/library/tarfile.rst ++++ b/Doc/library/tarfile.rst +@@ -81,6 +81,10 @@ + If *fileobj* is specified, it is used as an alternative to a :term:`file object` + opened in binary mode for *name*. It is supposed to be at position 0. 
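   As a rough illustration of the *fileobj* usage described above (the file
   name added to the archive is hypothetical), an in-memory archive can be
   written and then read back through an :class:`io.BytesIO` buffer positioned
   at 0::

      import io
      import tarfile

      buf = io.BytesIO()
      with tarfile.open(fileobj=buf, mode='w:gz') as tar:    # write an archive
          tar.add('somefile.txt')                            # hypothetical file
      buf.seek(0)                                            # open() expects the buffer at position 0
      with tarfile.open(fileobj=buf, mode='r:gz') as tar:    # read it back
          print(tar.getnames())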
+ ++ For modes ``'w:gz'``, ``'r:gz'``, ``'w:bz2'``, ``'r:bz2'``, :func:`tarfile.open` ++ accepts the keyword argument *compresslevel* to specify the compression level of ++ the file. ++ + For special purposes, there is a second format for *mode*: + ``'filemode|[compression]'``. :func:`tarfile.open` will return a :class:`TarFile` + object that processes its data as a stream of blocks. No random seeking will +@@ -292,7 +296,7 @@ + will be added as a pax global header if *format* is :const:`PAX_FORMAT`. + + +-.. method:: TarFile.open(...) ++.. classmethod:: TarFile.open(...) + + Alternative constructor. The :func:`tarfile.open` function is actually a + shortcut to this classmethod. +@@ -509,7 +513,7 @@ + :const:`AREGTYPE`, :const:`LNKTYPE`, :const:`SYMTYPE`, :const:`DIRTYPE`, + :const:`FIFOTYPE`, :const:`CONTTYPE`, :const:`CHRTYPE`, :const:`BLKTYPE`, + :const:`GNUTYPE_SPARSE`. To determine the type of a :class:`TarInfo` object +- more conveniently, use the ``is_*()`` methods below. ++ more conveniently, use the ``is*()`` methods below. + + + .. attribute:: TarInfo.linkname +diff -r c0e311e010fc Doc/library/test.rst +--- a/Doc/library/test.rst ++++ b/Doc/library/test.rst +@@ -461,7 +461,7 @@ + .. function:: make_bad_fd() + + Create an invalid file descriptor by opening and closing a temporary file, +- and returning its descripor. ++ and returning its descriptor. + + + .. function:: import_module(name, deprecated=False) +@@ -554,6 +554,21 @@ + run simultaneously, which is a problem for buildbots. + + ++.. function:: load_package_tests(pkg_dir, loader, standard_tests, pattern) ++ ++ Generic implementation of the :mod:`unittest` ``load_tests`` protocol for ++ use in test packages. *pkg_dir* is the root directory of the package; ++ *loader*, *standard_tests*, and *pattern* are the arguments expected by ++ ``load_tests``. In simple cases, the test package's ``__init__.py`` ++ can be the following:: ++ ++ import os ++ from test.support import load_package_tests ++ ++ def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) ++ ++ + The :mod:`test.support` module defines the following classes: + + .. class:: TransientResource(exc, **kwargs) +diff -r c0e311e010fc Doc/library/tokenize.rst +--- a/Doc/library/tokenize.rst ++++ b/Doc/library/tokenize.rst +@@ -131,6 +131,24 @@ + + .. versionadded:: 3.2 + ++.. exception:: TokenError ++ ++ Raised when either a docstring or expression that may be split over several ++ lines is not completed anywhere in the file, for example:: ++ ++ """Beginning of ++ docstring ++ ++ or:: ++ ++ [1, ++ 2, ++ 3 ++ ++Note that unclosed single-quoted strings do not cause an error to be ++raised. They are tokenized as ``ERRORTOKEN``, followed by the tokenization of ++their contents. ++ + + .. _tokenize-cli: + +diff -r c0e311e010fc Doc/library/turtle.rst +--- a/Doc/library/turtle.rst ++++ b/Doc/library/turtle.rst +@@ -1981,7 +1981,7 @@ + :param startx: if positive, starting position in pixels from the left + edge of the screen, if negative from the right edge, if None, + center window horizontally +- :param startx: if positive, starting position in pixels from the top ++ :param starty: if positive, starting position in pixels from the top + edge of the screen, if negative from the bottom edge, if None, + center window vertically + +diff -r c0e311e010fc Doc/make.bat +--- a/Doc/make.bat ++++ b/Doc/make.bat +@@ -76,6 +76,15 @@ + cmd /C %SPHINXBUILD% %SPHINXOPTS% -b%1 -dbuild\doctrees . 
%BUILDDIR%\%* + + if "%1" EQU "htmlhelp" ( ++ if not exist "%HTMLHELP%" ( ++ echo. ++ echo.The HTML Help Workshop was not found. Set the HTMLHELP variable ++ echo.to the path to hhc.exe or download and install it from ++ echo.http://msdn.microsoft.com/en-us/library/ms669985 ++ rem Set errorlevel to 1 and exit ++ cmd /C exit /b 1 ++ goto end ++ ) + cmd /C "%HTMLHELP%" build\htmlhelp\python%DISTVERSION:.=%.hhp + rem hhc.exe seems to always exit with code 1, reset to 0 for less than 2 + if not errorlevel 2 cmd /C exit /b 0 +diff -r c0e311e010fc Doc/reference/compound_stmts.rst +--- a/Doc/reference/compound_stmts.rst ++++ b/Doc/reference/compound_stmts.rst +@@ -22,14 +22,14 @@ + single: clause + single: suite + +-Compound statements consist of one or more 'clauses.' A clause consists of a ++A compound statement consists of one or more 'clauses.' A clause consists of a + header and a 'suite.' The clause headers of a particular compound statement are + all at the same indentation level. Each clause header begins with a uniquely + identifying keyword and ends with a colon. A suite is a group of statements + controlled by a clause. A suite can be one or more semicolon-separated simple + statements on the same line as the header, following the header's colon, or it + can be one or more indented statements on subsequent lines. Only the latter +-form of suite can contain nested compound statements; the following is illegal, ++form of a suite can contain nested compound statements; the following is illegal, + mostly because it wouldn't be clear to which :keyword:`if` clause a following + :keyword:`else` clause would belong:: + +@@ -156,8 +156,8 @@ + + The expression list is evaluated once; it should yield an iterable object. An + iterator is created for the result of the ``expression_list``. The suite is +-then executed once for each item provided by the iterator, in the order of +-ascending indices. Each item in turn is assigned to the target list using the ++then executed once for each item provided by the iterator, in the order returned ++by the iterator. Each item in turn is assigned to the target list using the + standard rules for assignments (see :ref:`assignment`), and then the suite is + executed. When the items are exhausted (which is immediately when the sequence + is empty or an iterator raises a :exc:`StopIteration` exception), the suite in +@@ -170,17 +170,25 @@ + A :keyword:`break` statement executed in the first suite terminates the loop + without executing the :keyword:`else` clause's suite. A :keyword:`continue` + statement executed in the first suite skips the rest of the suite and continues +-with the next item, or with the :keyword:`else` clause if there was no next ++with the next item, or with the :keyword:`else` clause if there is no next + item. + +-The suite may assign to the variable(s) in the target list; this does not affect +-the next item assigned to it. ++The for-loop makes assignments to the variables(s) in the target list. ++This overwrites all previous assignments to those variables including ++those made in the suite of the for-loop:: ++ ++ for i in range(10): ++ print(i) ++ i = 5 # this will not affect the for-loop ++ # because i will be overwritten with the next ++ # index in the range ++ + + .. index:: + builtin: range + + Names in the target list are not deleted when the loop is finished, but if the +-sequence is empty, it will not have been assigned to at all by the loop. Hint: ++sequence is empty, they will not have been assigned to at all by the loop. 
Hint: + the built-in function :func:`range` returns an iterator of integers suitable to + emulate the effect of Pascal's ``for i := a to b do``; e.g., ``list(range(3))`` + returns the list ``[0, 1, 2]``. +@@ -284,7 +292,7 @@ + object: traceback + + Before an except clause's suite is executed, details about the exception are +-stored in the :mod:`sys` module and can be access via :func:`sys.exc_info`. ++stored in the :mod:`sys` module and can be accessed via :func:`sys.exc_info`. + :func:`sys.exc_info` returns a 3-tuple consisting of the exception class, the + exception instance and a traceback object (see section :ref:`types`) identifying + the point in the program where the exception occurred. :func:`sys.exc_info` +@@ -461,7 +469,7 @@ + decorator: "@" `dotted_name` ["(" [`parameter_list` [","]] ")"] NEWLINE + dotted_name: `identifier` ("." `identifier`)* + parameter_list: (`defparameter` ",")* +- : ( "*" [`parameter`] ("," `defparameter`)* ["," "**" `parameter`] ++ : | "*" [`parameter`] ("," `defparameter`)* ["," "**" `parameter`] + : | "**" `parameter` + : | `defparameter` [","] ) + parameter: `identifier` [":" `expression`] +diff -r c0e311e010fc Doc/reference/datamodel.rst +--- a/Doc/reference/datamodel.rst ++++ b/Doc/reference/datamodel.rst +@@ -77,7 +77,7 @@ + module for information on controlling the collection of cyclic garbage. + Other implementations act differently and CPython may change. + Do not depend on immediate finalization of objects when they become +- unreachable (ex: always close files). ++ unreachable (so you should always close files explicitly). + + Note that the use of the implementation's tracing or debugging facilities may + keep objects alive that would normally be collectable. Also note that catching +@@ -285,16 +285,17 @@ + single: integer + single: Unicode + +- A string is a sequence of values that represent Unicode codepoints. +- All the codepoints in range ``U+0000 - U+10FFFF`` can be represented +- in a string. Python doesn't have a :c:type:`chr` type, and +- every character in the string is represented as a string object +- with length ``1``. The built-in function :func:`ord` converts a +- character to its codepoint (as an integer); :func:`chr` converts +- an integer in range ``0 - 10FFFF`` to the corresponding character. ++ A string is a sequence of values that represent Unicode code points. ++ All the code points in the range ``U+0000 - U+10FFFF`` can be ++ represented in a string. Python doesn't have a :c:type:`char` type; ++ instead, every code point in the string is represented as a string ++ object with length ``1``. The built-in function :func:`ord` ++ converts a code point from its string form to an integer in the ++ range ``0 - 10FFFF``; :func:`chr` converts an integer in the range ++ ``0 - 10FFFF`` to the corresponding length ``1`` string object. + :meth:`str.encode` can be used to convert a :class:`str` to +- :class:`bytes` using the given encoding, and :meth:`bytes.decode` can +- be used to achieve the opposite. ++ :class:`bytes` using the given text encoding, and ++ :meth:`bytes.decode` can be used to achieve the opposite. + + Tuples + .. index:: +@@ -1722,7 +1723,7 @@ + locking/synchronization. + + Here is an example of a metaclass that uses an :class:`collections.OrderedDict` +-to remember the order that class members were defined:: ++to remember the order that class variables are defined:: + + class OrderedClass(type): + +@@ -2102,9 +2103,9 @@ + + .. 
note:: + +- When :meth:`__index__` is defined, :meth:`__int__` should also be defined, +- and both shuld return the same value, in order to have a coherent integer +- type class. ++ In order to have a coherent integer type class, when :meth:`__index__` is ++ defined :meth:`__int__` should also be defined, and both should return ++ the same value. + + + .. _context-managers: +diff -r c0e311e010fc Doc/reference/executionmodel.rst +--- a/Doc/reference/executionmodel.rst ++++ b/Doc/reference/executionmodel.rst +@@ -31,11 +31,11 @@ + A :dfn:`block` is a piece of Python program text that is executed as a unit. + The following are blocks: a module, a function body, and a class definition. + Each command typed interactively is a block. A script file (a file given as +-standard input to the interpreter or specified on the interpreter command line +-the first argument) is a code block. A script command (a command specified on +-the interpreter command line with the '**-c**' option) is a code block. The +-string argument passed to the built-in functions :func:`eval` and :func:`exec` +-is a code block. ++standard input to the interpreter or specified as a command line argument to the ++interpreter) is a code block. A script command (a command specified on the ++interpreter command line with the '**-c**' option) is a code block. The string ++argument passed to the built-in functions :func:`eval` and :func:`exec` is a ++code block. + + .. index:: pair: execution; frame + +@@ -77,7 +77,7 @@ + single: UnboundLocalError + + When a name is not found at all, a :exc:`NameError` exception is raised. If the +-name refers to a local variable that has not been bound, a ++name refers to a local variable that has not been bound, an + :exc:`UnboundLocalError` exception is raised. :exc:`UnboundLocalError` is a + subclass of :exc:`NameError`. + +diff -r c0e311e010fc Doc/reference/expressions.rst +--- a/Doc/reference/expressions.rst ++++ b/Doc/reference/expressions.rst +@@ -29,7 +29,7 @@ + + When a description of an arithmetic operator below uses the phrase "the numeric + arguments are converted to a common type," this means that the operator +-implementation for built-in types works that way: ++implementation for built-in types works as follows: + + * If either argument is a complex number, the other is converted to complex; + +@@ -38,8 +38,9 @@ + + * otherwise, both must be integers and no conversion is necessary. + +-Some additional rules apply for certain operators (e.g., a string left argument +-to the '%' operator). Extensions must define their own conversion behavior. ++Some additional rules apply for certain operators (e.g., a string as a left ++argument to the '%' operator). Extensions must define their own conversion ++behavior. + + + .. _atoms: +@@ -183,7 +184,7 @@ + each time the innermost block is reached. + + Note that the comprehension is executed in a separate scope, so names assigned +-to in the target list don't "leak" in the enclosing scope. ++to in the target list don't "leak" into the enclosing scope. + + + .. _lists: +@@ -293,7 +294,7 @@ + brackets or curly braces. + + Variables used in the generator expression are evaluated lazily when the +-:meth:`~generator.__next__` method is called for generator object (in the same ++:meth:`~generator.__next__` method is called for the generator object (in the same + fashion as normal generators). 
However, the leftmost :keyword:`for` clause is + immediately evaluated, so that an error produced by it can be seen before any + other possible error in the code that handles the generator expression. +@@ -302,7 +303,7 @@ + range(10) for y in bar(x))``. + + The parentheses can be omitted on calls with only one argument. See section +-:ref:`calls` for the detail. ++:ref:`calls` for details. + + + .. _yieldexpr: +@@ -327,12 +328,12 @@ + generator. That generator then controls the execution of a generator function. + The execution starts when one of the generator's methods is called. At that + time, the execution proceeds to the first yield expression, where it is +-suspended again, returning the value of :token:`expression_list` to generator's ++suspended again, returning the value of :token:`expression_list` to the generator's + caller. By suspended, we mean that all local state is retained, including the + current bindings of local variables, the instruction pointer, and the internal + evaluation stack. When the execution is resumed by calling one of the + generator's methods, the function can proceed exactly as if the yield expression +-was just another external call. The value of the yield expression after ++were just another external call. The value of the yield expression after + resuming depends on the method which resumed the execution. If + :meth:`~generator.__next__` is used (typically via either a :keyword:`for` or + the :func:`next` builtin) then the result is :const:`None`. Otherwise, if +@@ -344,10 +345,10 @@ + All of this makes generator functions quite similar to coroutines; they yield + multiple times, they have more than one entry point and their execution can be + suspended. The only difference is that a generator function cannot control +-where should the execution continue after it yields; the control is always ++where the execution should continue after it yields; the control is always + transferred to the generator's caller. + +-yield expressions are allowed in the :keyword:`try` clause of a :keyword:`try` ++Yield expressions are allowed in the :keyword:`try` clause of a :keyword:`try` + ... :keyword:`finally` construct. If the generator is not resumed before it is + finalized (by reaching a zero reference count or by being garbage collected), + the generator-iterator's :meth:`~generator.close` method will be called, +@@ -430,7 +431,7 @@ + + .. method:: generator.throw(type[, value[, traceback]]) + +- Raises an exception of type ``type`` at the point where generator was paused, ++ Raises an exception of type ``type`` at the point where the generator was paused, + and returns the next value yielded by the generator function. If the generator + exits without yielding another value, a :exc:`StopIteration` exception is + raised. If the generator function does not catch the passed-in exception, or +@@ -520,11 +521,11 @@ + + The primary must evaluate to an object of a type that supports attribute + references, which most objects do. This object is then asked to produce the +-attribute whose name is the identifier (which can be customized by overriding +-the :meth:`__getattr__` method). If this attribute is not available, the +-exception :exc:`AttributeError` is raised. Otherwise, the type and value of the +-object produced is determined by the object. Multiple evaluations of the same +-attribute reference may yield different objects. ++attribute whose name is the identifier. This production can be customized by ++overriding the :meth:`__getattr__` method. 
If this attribute is not available, ++the exception :exc:`AttributeError` is raised. Otherwise, the type and value of ++the object produced is determined by the object. Multiple evaluations of the ++same attribute reference may yield different objects. + + + .. _subscriptions: +@@ -549,9 +550,9 @@ + .. productionlist:: + subscription: `primary` "[" `expression_list` "]" + +-The primary must evaluate to an object that supports subscription, e.g. a list +-or dictionary. User-defined objects can support subscription by defining a +-:meth:`__getitem__` method. ++The primary must evaluate to an object that supports subscription (lists or ++dictionaries for example). User-defined objects can support subscription by ++defining a :meth:`__getitem__` method. + + For built-in objects, there are two types of objects that support subscription: + +@@ -660,8 +661,8 @@ + keyword_arguments: `keyword_item` ("," `keyword_item`)* + keyword_item: `identifier` "=" `expression` + +-A trailing comma may be present after the positional and keyword arguments but +-does not affect the semantics. ++An optional trailing comma may be present after the positional and keyword arguments ++but does not affect the semantics. + + .. index:: + single: parameter; call semantics +@@ -943,9 +944,9 @@ + .. index:: single: addition + + The ``+`` (addition) operator yields the sum of its arguments. The arguments +-must either both be numbers or both sequences of the same type. In the former +-case, the numbers are converted to a common type and then added together. In +-the latter case, the sequences are concatenated. ++must either both be numbers or both be sequences of the same type. In the ++former case, the numbers are converted to a common type and then added together. ++In the latter case, the sequences are concatenated. + + .. index:: single: subtraction + +@@ -1106,7 +1107,7 @@ + another one is made arbitrarily but consistently within one execution of a + program. + +-Comparison of objects of the differing types depends on whether either of the ++Comparison of objects of differing types depends on whether either of the + types provide explicit support for the comparison. Most numeric types can be + compared with one another. When cross-type comparison is not supported, the + comparison method returns ``NotImplemented``. +@@ -1116,7 +1117,7 @@ + The operators :keyword:`in` and :keyword:`not in` test for membership. ``x in + s`` evaluates to true if *x* is a member of *s*, and false otherwise. ``x not + in s`` returns the negation of ``x in s``. All built-in sequences and set types +-support this as well as dictionary, for which :keyword:`in` tests whether a the ++support this as well as dictionary, for which :keyword:`in` tests whether the + dictionary has a given key. For container types such as list, tuple, set, + frozenset, dict, or collections.deque, the expression ``x in y`` is equivalent + to ``any(x is e or x == e for e in y)``. +@@ -1202,9 +1203,9 @@ + they return to ``False`` and ``True``, but rather return the last evaluated + argument. This is sometimes useful, e.g., if ``s`` is a string that should be + replaced by a default value if it is empty, the expression ``s or 'foo'`` yields +-the desired value. Because :keyword:`not` has to invent a value anyway, it does +-not bother to return a value of the same type as its argument, so e.g., ``not +-'foo'`` yields ``False``, not ``''``.) ++the desired value. 
Because :keyword:`not` has to create a new value, it ++returns a boolean value regardless of the type of its argument ++(for example, ``not 'foo'`` produces ``False`` rather than ``''``.) + + + Conditional expressions +@@ -1222,8 +1223,8 @@ + Conditional expressions (sometimes called a "ternary operator") have the lowest + priority of all Python operations. + +-The expression ``x if C else y`` first evaluates the condition, *C* (*not* *x*); +-if *C* is true, *x* is evaluated and its value is returned; otherwise, *y* is ++The expression ``x if C else y`` first evaluates the condition, *C* rather than *x*. ++If *C* is true, *x* is evaluated and its value is returned; otherwise, *y* is + evaluated and its value is returned. + + See :pep:`308` for more details about conditional expressions. +@@ -1244,10 +1245,9 @@ + lambda_expr: "lambda" [`parameter_list`]: `expression` + lambda_expr_nocond: "lambda" [`parameter_list`]: `expression_nocond` + +-Lambda expressions (sometimes called lambda forms) have the same syntactic position as +-expressions. They are a shorthand to create anonymous functions; the expression +-``lambda arguments: expression`` yields a function object. The unnamed object +-behaves like a function object defined with :: ++Lambda expressions (sometimes called lambda forms) are used to create anonymous ++functions. The expression ``lambda arguments: expression`` yields a function ++object. The unnamed object behaves like a function object defined with :: + + def (arguments): + return expression +@@ -1310,13 +1310,15 @@ + + .. index:: pair: operator; precedence + +-The following table summarizes the operator precedences in Python, from lowest ++The following table summarizes the operator precedence in Python, from lowest + precedence (least binding) to highest precedence (most binding). Operators in + the same box have the same precedence. Unless the syntax is explicitly given, + operators are binary. Operators in the same box group left to right (except for +-comparisons, including tests, which all have the same precedence and chain from +-left to right --- see section :ref:`comparisons` --- and exponentiation, which +-groups from right to left). ++exponentiation, which groups from right to left). ++ ++Note that comparisons, membership tests, and identity tests, all have the same ++precedence and have a left-to-right chaining feature as described in the ++:ref:`comparisons` section. + + + +-----------------------------------------------+-------------------------------------+ +diff -r c0e311e010fc Doc/reference/simple_stmts.rst +--- a/Doc/reference/simple_stmts.rst ++++ b/Doc/reference/simple_stmts.rst +@@ -7,7 +7,7 @@ + + .. index:: pair: simple; statement + +-Simple statements are comprised within a single logical line. Several simple ++A simple statement is comprised within a single logical line. Several simple + statements may occur on a single line separated by semicolons. The syntax for + simple statements is: + +@@ -91,8 +91,8 @@ + : | `slicing` + : | "*" `target` + +-(See section :ref:`primaries` for the syntax definitions for the last three +-symbols.) ++(See section :ref:`primaries` for the syntax definitions for *attributeref*, ++*subscription*, and *slicing*.) + + An assignment statement evaluates the expression list (remember that this can be + a single expression or a comma-separated list, the latter yielding a tuple) and +@@ -228,7 +228,7 @@ + inclusive. Finally, the sequence object is asked to replace the slice with + the items of the assigned sequence. 
The length of the slice may be different + from the length of the assigned sequence, thus changing the length of the +- target sequence, if the object allows it. ++ target sequence, if the target sequence allows it. + + .. impl-detail:: + +@@ -236,14 +236,15 @@ + as for expressions, and invalid syntax is rejected during the code generation + phase, causing less detailed error messages. + +-WARNING: Although the definition of assignment implies that overlaps between the +-left-hand side and the right-hand side are 'safe' (for example ``a, b = b, a`` +-swaps two variables), overlaps *within* the collection of assigned-to variables +-are not safe! For instance, the following program prints ``[0, 2]``:: ++Although the definition of assignment implies that overlaps between the ++left-hand side and the right-hand side are 'simultanenous' (for example ``a, b = ++b, a`` swaps two variables), overlaps *within* the collection of assigned-to ++variables occur left-to-right, sometimes resulting in confusion. For instance, ++the following program prints ``[0, 2]``:: + + x = [0, 1] + i = 0 +- i, x[i] = 1, 2 ++ i, x[i] = 1, 2 # i is updated, then x[i] is updated + print(x) + + +@@ -283,7 +284,7 @@ + augop: "+=" | "-=" | "*=" | "/=" | "//=" | "%=" | "**=" + : | ">>=" | "<<=" | "&=" | "^=" | "|=" + +-(See section :ref:`primaries` for the syntax definitions for the last three ++(See section :ref:`primaries` for the syntax definitions of the last three + symbols.) + + An augmented assignment evaluates the target (which, unlike normal assignment +@@ -297,6 +298,11 @@ + is performed *in-place*, meaning that rather than creating a new object and + assigning that to the target, the old object is modified instead. + ++Unlike normal assignments, augmented assignments evaluate the left-hand side ++*before* evaluating the right-hand side. For example, ``a[i] += f(x)`` first ++looks-up ``a[i]``, then it evaluates ``f(x)`` and performs the addition, and ++lastly, it writes the result back to ``a[i]``. ++ + With the exception of assigning to tuples and multiple targets in a single + statement, the assignment done by augmented assignment statements is handled the + same way as normal assignments. Similarly, with the exception of the possible +@@ -658,7 +664,7 @@ + as though the clauses had been separated out into individiual import + statements. + +-The details of the first step, finding and loading modules is described in ++The details of the first step, finding and loading modules are described in + greater detail in the section on the :ref:`import system `, + which also describes the various types of packages and modules that can + be imported, as well as all the hooks that can be used to customize +@@ -689,7 +695,7 @@ + + The :keyword:`from` form uses a slightly more complex process: + +-#. find the module specified in the :keyword:`from` clause loading and ++#. find the module specified in the :keyword:`from` clause, loading and + initializing it if necessary; + #. for each of the identifiers specified in the :keyword:`import` clauses: + +@@ -697,7 +703,7 @@ + #. if not, attempt to import a submodule with that name and then + check the imported module again for that attribute + #. if the attribute is not found, :exc:`ImportError` is raised. +- #. otherwise, a reference to that value is bound in the local namespace, ++ #. 
otherwise, a reference to that value is stored in the local namespace, + using the name in the :keyword:`as` clause if it is present, + otherwise using the attribute name + +@@ -726,9 +732,9 @@ + library modules which were imported and used within the module). + + The :keyword:`from` form with ``*`` may only occur in a module scope. The wild +-card form of import --- ``import *`` --- is only allowed at the module level. +-Attempting to use it in class or function definitions will raise a +-:exc:`SyntaxError`. ++card form of import --- ``from module import *`` --- is only allowed at the ++module level. Attempting to use it in class or function definitions will raise ++a :exc:`SyntaxError`. + + .. index:: + single: relative; import +@@ -747,7 +753,7 @@ + The specification for relative imports is contained within :pep:`328`. + + :func:`importlib.import_module` is provided to support applications that +-determine which modules need to be loaded dynamically. ++determine dynamically the modules to be loaded. + + + .. _future: +@@ -759,10 +765,12 @@ + + A :dfn:`future statement` is a directive to the compiler that a particular + module should be compiled using syntax or semantics that will be available in a +-specified future release of Python. The future statement is intended to ease +-migration to future versions of Python that introduce incompatible changes to +-the language. It allows use of the new features on a per-module basis before +-the release in which the feature becomes standard. ++specified future release of Python where the feature becomes standard. ++ ++The future statement is intended to ease migration to future versions of Python ++that introduce incompatible changes to the language. It allows use of the new ++features on a per-module basis before the release in which the feature becomes ++standard. + + .. productionlist:: * + future_statement: "from" "__future__" "import" feature ["as" name] +@@ -857,7 +865,7 @@ + + .. impl-detail:: + +- The current implementation does not enforce the latter two restrictions, but ++ The current implementation does not enforce the two restrictions, but + programs should not abuse this freedom, as future implementations may enforce + them or silently change the meaning of the program. + +@@ -890,16 +898,16 @@ + : | "nonlocal" identifier augop expression_list + + The :keyword:`nonlocal` statement causes the listed identifiers to refer to +-previously bound variables in the nearest enclosing scope. This is important +-because the default behavior for binding is to search the local namespace +-first. The statement allows encapsulated code to rebind variables outside of +-the local scope besides the global (module) scope. ++previously bound variables in the nearest enclosing scope excluding globals. ++This is important because the default behavior for binding is to search the ++local namespace first. The statement allows encapsulated code to rebind ++variables outside of the local scope besides the global (module) scope. + + .. XXX not implemented + The :keyword:`nonlocal` statement may prepend an assignment or augmented + assignment, but not an expression. + +-Names listed in a :keyword:`nonlocal` statement, unlike to those listed in a ++Names listed in a :keyword:`nonlocal` statement, unlike those listed in a + :keyword:`global` statement, must refer to pre-existing bindings in an + enclosing scope (the scope in which a new binding should be created cannot + be determined unambiguously). 
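A small sketch of the behaviour described above (the function names are
illustrative only): a :keyword:`nonlocal` declaration lets an inner function
rebind a name that already exists in the enclosing function's scope::

   def make_counter():
       count = 0                  # pre-existing binding in the enclosing scope

       def increment():
           nonlocal count         # rebind the enclosing 'count', not a new local
           count += 1
           return count

       return increment

   counter = make_counter()
   counter()   # returns 1
   counter()   # returns 2

Without the :keyword:`nonlocal` statement, the ``count += 1`` line would raise
:exc:`UnboundLocalError`, because the assignment would make ``count`` local to
``increment()``.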
+diff -r c0e311e010fc Doc/tools/sphinxext/indexsidebar.html +--- a/Doc/tools/sphinxext/indexsidebar.html ++++ b/Doc/tools/sphinxext/indexsidebar.html +@@ -1,17 +1,18 @@ +-

[indexsidebar.html hunk garbled during extraction: the HTML tags were stripped, leaving only the sidebar heading texts "Download", "Download these documents", "Docs for other versions" and "Other resources" from both the removed and the added markup.]
++ +diff -r c0e311e010fc Doc/tutorial/classes.rst +--- a/Doc/tutorial/classes.rst ++++ b/Doc/tutorial/classes.rst +@@ -387,6 +387,77 @@ + argument list. + + ++.. _tut-class-and-instance-variables: ++ ++Class and Instance Variables ++---------------------------- ++ ++Generally speaking, instance variables are for data unique to each instance ++and class variables are for attributes and methods shared by all instances ++of the class:: ++ ++ class Dog: ++ ++ kind = 'canine' # class variable shared by all instances ++ ++ def __init__(self, name): ++ self.name = name # instance variable unique to each instance ++ ++ >>> d = Dog('Fido') ++ >>> e = Dog('Buddy') ++ >>> d.kind # shared by all dogs ++ 'canine' ++ >>> e.kind # shared by all dogs ++ 'canine' ++ >>> d.name # unique to d ++ 'Fido' ++ >>> e.name # unique to e ++ 'Buddy' ++ ++As discussed in :ref:`tut-object`, shared data can have possibly surprising ++effects with involving :term:`mutable` objects such as lists and dictionaries. ++For example, the *tricks* list in the following code should not be used as a ++class variable because just a single list would be shared by all *Dog* ++instances:: ++ ++ class Dog: ++ ++ tricks = [] # mistaken use of a class variable ++ ++ def __init__(self, name): ++ self.name = name ++ ++ def add_trick(self, trick): ++ self.tricks.append(trick) ++ ++ >>> d = Dog('Fido') ++ >>> e = Dog('Buddy') ++ >>> d.add_trick('roll over') ++ >>> e.add_trick('play dead') ++ >>> d.tricks # unexpectedly shared by all dogs ++ ['roll over', 'play dead'] ++ ++Correct design of the class should use an instance variable instead:: ++ ++ class Dog: ++ ++ def __init__(self, name): ++ self.name = name ++ self.tricks = [] # creates a new empty list for each dog ++ ++ def add_trick(self, trick): ++ self.tricks.append(trick) ++ ++ >>> d = Dog('Fido') ++ >>> e = Dog('Buddy') ++ >>> d.add_trick('roll over') ++ >>> e.add_trick('play dead') ++ >>> d.tricks ++ ['roll over'] ++ >>> e.tricks ++ ['play dead'] ++ ++ + .. _tut-remarks: + + Random Remarks +diff -r c0e311e010fc Doc/tutorial/datastructures.rst +--- a/Doc/tutorial/datastructures.rst ++++ b/Doc/tutorial/datastructures.rst +@@ -111,10 +111,15 @@ + >>> a.sort() + >>> a + [-1, 1, 66.25, 333, 333, 1234.5] ++ >>> a.pop() ++ 1234.5 ++ >>> a ++ [-1, 1, 66.25, 333, 333] + + You might have noticed that methods like ``insert``, ``remove`` or ``sort`` that +-modify the list have no return value printed -- they return ``None``. [1]_ This +-is a design principle for all mutable data structures in Python. ++only modify the list have no return value printed -- they return the default ++``None``. [1]_ This is a design principle for all mutable data structures in ++Python. + + + .. _tut-lists-as-stacks: +diff -r c0e311e010fc Doc/tutorial/stdlib.rst +--- a/Doc/tutorial/stdlib.rst ++++ b/Doc/tutorial/stdlib.rst +@@ -40,7 +40,9 @@ + + >>> import shutil + >>> shutil.copyfile('data.db', 'archive.db') ++ 'archive.db' + >>> shutil.move('/build/executables', 'installdir') ++ 'installdir' + + + .. _tut-file-wildcards: +diff -r c0e311e010fc Doc/using/cmdline.rst +--- a/Doc/using/cmdline.rst ++++ b/Doc/using/cmdline.rst +@@ -616,8 +616,8 @@ + + .. envvar:: PYTHONASYNCIODEBUG + +- If this environment variable is set to a non-empty string, enable the debug +- mode of the :mod:`asyncio` module. ++ If this environment variable is set to a non-empty string, enable the ++ :ref:`debug mode ` of the :mod:`asyncio` module. + + .. 
versionadded:: 3.4 + +diff -r c0e311e010fc Doc/whatsnew/2.7.rst +--- a/Doc/whatsnew/2.7.rst ++++ b/Doc/whatsnew/2.7.rst +@@ -7,7 +7,6 @@ + .. hyperlink all the methods & functions. + + .. T_STRING_INPLACE not described in main docs +-.. "Format String Syntax" in string.rst could use many more examples. + + .. $Id$ + Rules for maintenance: +@@ -50,17 +49,16 @@ + This saves the maintainer some effort going through the SVN logs + when researching a change. + +-This article explains the new features in Python 2.7. The final +-release of 2.7 is currently scheduled for July 2010; the detailed +-schedule is described in :pep:`373`. ++This article explains the new features in Python 2.7. Python 2.7 was released ++on July 3, 2010. + + Numeric handling has been improved in many ways, for both +-floating-point numbers and for the :class:`Decimal` class. There are +-some useful additions to the standard library, such as a greatly +-enhanced :mod:`unittest` module, the :mod:`argparse` module for +-parsing command-line options, convenient ordered-dictionary and +-:class:`Counter` classes in the :mod:`collections` module, and many +-other improvements. ++floating-point numbers and for the :class:`~decimal.Decimal` class. ++There are some useful additions to the standard library, such as a ++greatly enhanced :mod:`unittest` module, the :mod:`argparse` module ++for parsing command-line options, convenient :class:`~collections.OrderedDict` ++and :class:`~collections.Counter` classes in the :mod:`collections` module, ++and many other improvements. + + Python 2.7 is planned to be the last of the 2.x releases, so we worked + on making it a good release for the long term. To help with porting +@@ -81,45 +79,91 @@ + The Future for Python 2.x + ========================= + +-Python 2.7 is intended to be the last major release in the 2.x series. +-The Python maintainers are planning to focus their future efforts on +-the Python 3.x series. +- +-This means that 2.7 will remain in place for a long time, running +-production systems that have not been ported to Python 3.x. +-Two consequences of the long-term significance of 2.7 are: +- +-* It's very likely the 2.7 release will have a longer period of +- maintenance compared to earlier 2.x versions. Python 2.7 will +- continue to be maintained while the transition to 3.x continues, and +- the developers are planning to support Python 2.7 with bug-fix +- releases beyond the typical two years. +- +-* A policy decision was made to silence warnings only of interest to +- developers. :exc:`DeprecationWarning` and its +- descendants are now ignored unless otherwise requested, preventing +- users from seeing warnings triggered by an application. This change +- was also made in the branch that will become Python 3.2. (Discussed +- on stdlib-sig and carried out in :issue:`7319`.) +- +- In previous releases, :exc:`DeprecationWarning` messages were +- enabled by default, providing Python developers with a clear +- indication of where their code may break in a future major version +- of Python. +- +- However, there are increasingly many users of Python-based +- applications who are not directly involved in the development of +- those applications. :exc:`DeprecationWarning` messages are +- irrelevant to such users, making them worry about an application +- that's actually working correctly and burdening application developers +- with responding to these concerns. 
+- +- You can re-enable display of :exc:`DeprecationWarning` messages by +- running Python with the :option:`-Wdefault` (short form: +- :option:`-Wd`) switch, or by setting the :envvar:`PYTHONWARNINGS` +- environment variable to ``"default"`` (or ``"d"``) before running +- Python. Python code can also re-enable them +- by calling ``warnings.simplefilter('default')``. ++Python 2.7 is the last major release in the 2.x series, as the Python ++maintainers have shifted the focus of their new feature development efforts ++to the Python 3.x series. This means that while Python 2 continues to ++receive bug fixes, and to be updated to build correctly on new hardware and ++versions of supported operated systems, there will be no new full feature ++releases for the language or standard library. ++ ++However, while there is a large common subset between Python 2.7 and Python ++3, and many of the changes involved in migrating to that common subset, or ++directly to Python 3, can be safely automated, some other changes (notably ++those associated with Unicode handling) may require careful consideration, ++and preferably robust automated regression test suites, to migrate ++effectively. ++ ++This means that Python 2.7 will remain in place for a long time, providing a ++stable and supported base platform for production systems that have not yet ++been ported to Python 3. The full expected lifecycle of the Python 2.7 ++series is detailed in :pep:`373`. ++ ++Some key consequences of the long-term significance of 2.7 are: ++ ++* As noted above, the 2.7 release has a much longer period of maintenance ++ when compared to earlier 2.x versions. Python 2.7 is currently expected to ++ remain supported by the core development team (receiving security updates ++ and other bug fixes) until at least 2020 (10 years after its initial ++ release, compared to the more typical support period of 18-24 months). ++ ++* As the Python 2.7 standard library ages, making effective use of the ++ Python Package Index (either directly or via a redistributor) becomes ++ more important for Python 2 users. In addition to a wide variety of third ++ party packages for various tasks, the available packages include backports ++ of new modules and features from the Python 3 standard library that are ++ compatible with Python 2, as well as various tools and libraries that can ++ make it easier to migrate to Python 3. The `Python Packaging User Guide ++ `__ provides guidance on downloading and ++ installing software from the Python Package Index. ++ ++* While the preferred approach to enhancing Python 2 is now the publication ++ of new packages on the Python Package Index, this approach doesn't ++ necessarily work in all cases, especially those related to network ++ security. In exceptional cases that cannot be handled adequately by ++ publishing new or updated packages on PyPI, the Python Enhancement ++ Proposal process may be used to make the case for adding new features ++ directly to the Python 2 standard library. Any such additions, and the ++ maintenance releases where they were added, will be noted in the ++ :ref:`py27-maintenance-enhancements` section below. ++ ++For projects wishing to migrate from Python 2 to Python 3, or for library ++and framework developers wishing to support users on both Python 2 and ++Python 3, there are a variety of tools and guides available to help decide ++on a suitable approach and manage some of the technical details involved. ++The recommended starting point is the :ref:`pyporting-howto` HOWTO guide. 
++ ++ ++Changes to the Handling of Deprecation Warnings ++=============================================== ++ ++For Python 2.7, a policy decision was made to silence warnings only of ++interest to developers by default. :exc:`DeprecationWarning` and its ++descendants are now ignored unless otherwise requested, preventing ++users from seeing warnings triggered by an application. This change ++was also made in the branch that became Python 3.2. (Discussed ++on stdlib-sig and carried out in :issue:`7319`.) ++ ++In previous releases, :exc:`DeprecationWarning` messages were ++enabled by default, providing Python developers with a clear ++indication of where their code may break in a future major version ++of Python. ++ ++However, there are increasingly many users of Python-based ++applications who are not directly involved in the development of ++those applications. :exc:`DeprecationWarning` messages are ++irrelevant to such users, making them worry about an application ++that's actually working correctly and burdening application developers ++with responding to these concerns. ++ ++You can re-enable display of :exc:`DeprecationWarning` messages by ++running Python with the :option:`-Wdefault <-W>` (short form: ++:option:`-Wd <-W>`) switch, or by setting the :envvar:`PYTHONWARNINGS` ++environment variable to ``"default"`` (or ``"d"``) before running ++Python. Python code can also re-enable them ++by calling ``warnings.simplefilter('default')``. ++ ++The ``unittest`` module also automatically reenables deprecation warnings ++when running tests. + + + Python 3.1 Features +@@ -133,7 +177,7 @@ + A partial list of 3.1 features that were backported to 2.7: + + * The syntax for set literals (``{1,2,3}`` is a mutable set). +-* Dictionary and set comprehensions (``{ i: i*2 for i in range(3)}``). ++* Dictionary and set comprehensions (``{i: i*2 for i in range(3)}``). + * Multiple context managers in a single :keyword:`with` statement. + * A new version of the :mod:`io` library, rewritten in C for performance. + * The ordered-dictionary type described in :ref:`pep-0372`. +@@ -155,7 +199,7 @@ + * :func:`operator.isCallable` and :func:`operator.sequenceIncludes`, + which are not supported in 3.x, now trigger warnings. + * The :option:`-3` switch now automatically +- enables the :option:`-Qwarn` switch that causes warnings ++ enables the :option:`-Qwarn <-Q>` switch that causes warnings + about using classic division with integers and long integers. + + +@@ -390,9 +434,10 @@ + + .. seealso:: + +- `argparse module documentation `__ +- +- `Upgrading optparse code to use argparse `__ ++ :mod:`argparse` documentation ++ The documentation page of the argparse module. ++ ++ :ref:`upgrading-optparse-code` + Part of the Python documentation, describing how to convert + code that uses :mod:`optparse`. + +@@ -402,8 +447,6 @@ + PEP 391: Dictionary-Based Configuration For Logging + ==================================================== + +-.. XXX not documented in library reference yet; add link here once it's added. +- + The :mod:`logging` module is very flexible; applications can define + a tree of logging subsystems, and each logger in this tree can filter + out certain messages, format them differently, and direct messages to +@@ -412,21 +455,21 @@ + All this flexibility can require a lot of configuration. You can + write Python statements to create objects and set their properties, + but a complex set-up requires verbose but boring code. 
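As a rough sketch of the purely programmatic style being contrasted here (the
handler, format string, and logger names are illustrative only)::

    import logging

    # Wire up one handler and one formatter by hand, object by object.
    handler = logging.StreamHandler()
    handler.setFormatter(
        logging.Formatter('%(asctime)s %(levelname)s %(name)s: %(message)s'))

    root = logging.getLogger()
    root.addHandler(handler)
    root.setLevel(logging.INFO)

    logging.getLogger('network').info('listening')

Every additional logger, handler, filter, or formatter needs another few lines
of this kind, which is the verbosity that the dictionary-based configuration
described below avoids.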
+-:mod:`logging` also supports a :func:`~logging.config.fileConfig` ++:mod:`logging` also supports a :func:`~logging.fileConfig` + function that parses a file, but the file format doesn't support + configuring filters, and it's messier to generate programmatically. + +-Python 2.7 adds a :func:`~logging.config.dictConfig` function that ++Python 2.7 adds a :func:`~logging.dictConfig` function that + uses a dictionary to configure logging. There are many ways to + produce a dictionary from different sources: construct one with code; + parse a file containing JSON; or use a YAML parsing library if one is +-installed. ++installed. For more information see :ref:`logging-config-api`. + + The following example configures two loggers, the root logger and a +-logger named "network". Messages sent to the root logger will be ++logger named "network". Messages sent to the root logger will be + sent to the system log using the syslog protocol, and messages + to the "network" logger will be written to a :file:`network.log` file +-that will be rotated once the log reaches 1Mb. ++that will be rotated once the log reaches 1MB. + + :: + +@@ -445,7 +488,7 @@ + 'filename': '/logs/network.log', + 'formatter': 'standard', + 'level': 'INFO', +- 'maxBytes': 1024*1024}, ++ 'maxBytes': 1000000}, + 'syslog': {'class': 'logging.handlers.SysLogHandler', + 'formatter': 'standard', + 'level': 'ERROR'}}, +@@ -483,16 +526,19 @@ + for UDP or :const:`socket.SOCK_STREAM` for TCP. The default + protocol remains UDP. + +-* :class:`Logger` instances gained a :meth:`getChild` method that retrieves a +- descendant logger using a relative path. For example, +- once you retrieve a logger by doing ``log = getLogger('app')``, ++* :class:`~logging.Logger` instances gained a :meth:`~logging.Logger.getChild` ++ method that retrieves a descendant logger using a relative path. ++ For example, once you retrieve a logger by doing ``log = getLogger('app')``, + calling ``log.getChild('network.listen')`` is equivalent to + ``getLogger('app.network.listen')``. + +-* The :class:`LoggerAdapter` class gained a :meth:`isEnabledFor` method +- that takes a *level* and returns whether the underlying logger would ++* The :class:`~logging.LoggerAdapter` class gained a ++ :meth:`~logging.LoggerAdapter.isEnabledFor` method that takes a ++ *level* and returns whether the underlying logger would + process a message of that level of importance. + ++.. XXX: Logger objects don't have a class declaration so the link don't work ++ + .. seealso:: + + :pep:`391` - Dictionary-Based Configuration For Logging +@@ -501,14 +547,15 @@ + PEP 3106: Dictionary Views + ==================================================== + +-The dictionary methods :meth:`keys`, :meth:`values`, and :meth:`items` +-are different in Python 3.x. They return an object called a :dfn:`view` +-instead of a fully materialized list. +- +-It's not possible to change the return values of :meth:`keys`, +-:meth:`values`, and :meth:`items` in Python 2.7 because too much code +-would break. Instead the 3.x versions were added under the new names +-:meth:`viewkeys`, :meth:`viewvalues`, and :meth:`viewitems`. ++The dictionary methods :meth:`~dict.keys`, :meth:`~dict.values`, and ++:meth:`~dict.items` are different in Python 3.x. They return an object ++called a :dfn:`view` instead of a fully materialized list. ++ ++It's not possible to change the return values of :meth:`~dict.keys`, ++:meth:`~dict.values`, and :meth:`~dict.items` in Python 2.7 because ++too much code would break. 
Instead the 3.x versions were added ++under the new names :meth:`~dict.viewkeys`, :meth:`~dict.viewvalues`, ++and :meth:`~dict.viewitems`. + + :: + +@@ -550,8 +597,8 @@ + RuntimeError: dictionary changed size during iteration + + You can use the view methods in Python 2.x code, and the 2to3 +-converter will change them to the standard :meth:`keys`, +-:meth:`values`, and :meth:`items` methods. ++converter will change them to the standard :meth:`~dict.keys`, ++:meth:`~dict.values`, and :meth:`~dict.items` methods. + + .. seealso:: + +@@ -624,7 +671,7 @@ + ``{}`` continues to represent an empty dictionary; use + ``set()`` for an empty set. + +- >>> {1,2,3,4,5} ++ >>> {1, 2, 3, 4, 5} + set([1, 2, 3, 4, 5]) + >>> set() # empty set + set([]) +@@ -794,7 +841,7 @@ + ``None`` as its first argument. (Fixed by Georg Brandl; + :issue:`4759`.) + +- .. bytearray doesn't seem to be documented ++ .. XXX bytearray doesn't seem to be documented + + * When using ``@classmethod`` and ``@staticmethod`` to wrap + methods as class or static methods, the wrapper object now +@@ -867,12 +914,6 @@ + + Several performance enhancements have been added: + +-.. * A new :program:`configure` option, :option:`--with-computed-gotos`, +- compiles the main bytecode interpreter loop using a new dispatch +- mechanism that gives speedups of up to 20%, depending on the system +- and benchmark. The new mechanism is only supported on certain +- compilers, such as gcc, SunPro, and icc. +- + * A new opcode was added to perform the initial setup for + :keyword:`with` statements, looking up the :meth:`__enter__` and + :meth:`__exit__` methods. (Contributed by Benjamin Peterson.) +@@ -1054,7 +1095,7 @@ + :meth:`~collections.deque.count` method that returns the number of + contained elements equal to the supplied argument *x*, and a + :meth:`~collections.deque.reverse` method that reverses the elements +- of the deque in-place. :class:`deque` also exposes its maximum ++ of the deque in-place. :class:`~collections.deque` also exposes its maximum + length as the read-only :attr:`~collections.deque.maxlen` attribute. + (Both features added by Raymond Hettinger.) + +@@ -1135,15 +1176,14 @@ + ``Decimal('0.1000000000000000055511151231257827021181583404541015625')``. + (Implemented by Raymond Hettinger; :issue:`4796`.) + +- Comparing instances of :class:`Decimal` with floating-point ++ Comparing instances of :class:`~decimal.Decimal` with floating-point + numbers now produces sensible results based on the numeric values + of the operands. Previously such comparisons would fall back to + Python's default rules for comparing objects, which produced arbitrary + results based on their type. Note that you still cannot combine + :class:`Decimal` and floating-point in other operations such as addition, + since you should be explicitly choosing how to convert between float and +- :class:`Decimal`. +- (Fixed by Mark Dickinson; :issue:`2531`.) ++ :class:`~decimal.Decimal`. (Fixed by Mark Dickinson; :issue:`2531`.) + + The constructor for :class:`~decimal.Decimal` now accepts + floating-point numbers (added by Raymond Hettinger; :issue:`8257`) +@@ -1195,8 +1235,8 @@ + + Ordering comparisons (``<``, ``<=``, ``>``, ``>=``) between + fractions and complex numbers now raise a :exc:`TypeError`. +- This fixes an oversight, making the :class:`Fraction` match the other +- numeric types. ++ This fixes an oversight, making the :class:`~fractions.Fraction` ++ match the other numeric types. + + .. 
revision 79455 + +@@ -1210,7 +1250,7 @@ + uploads thanks to an added *rest* parameter (patch by Pablo Mouzo; + :issue:`6845`.) + +-* New class decorator: :func:`total_ordering` in the :mod:`functools` ++* New class decorator: :func:`~functools.total_ordering` in the :mod:`functools` + module takes a class that defines an :meth:`__eq__` method and one of + :meth:`__lt__`, :meth:`__le__`, :meth:`__gt__`, or :meth:`__ge__`, + and generates the missing comparison methods. Since the +@@ -1218,7 +1258,7 @@ + this decorator makes it easier to define ordered classes. + (Added by Raymond Hettinger; :issue:`5479`.) + +- New function: :func:`cmp_to_key` will take an old-style comparison ++ New function: :func:`~functools.cmp_to_key` will take an old-style comparison + function that expects two arguments and return a new callable that + can be used as the *key* parameter to functions such as + :func:`sorted`, :func:`min` and :func:`max`, etc. The primary +@@ -1345,7 +1385,7 @@ + with any object literal that decodes to a list of pairs. + (Contributed by Raymond Hettinger; :issue:`5381`.) + +-* The :mod:`mailbox` module's :class:`Maildir` class now records the ++* The :mod:`mailbox` module's :class:`~mailbox.Maildir` class now records the + timestamp on the directories it reads, and only re-reads them if the + modification time has subsequently changed. This improves + performance by avoiding unneeded directory scans. (Fixed by +@@ -1432,7 +1472,7 @@ + * The :mod:`signal` module no longer re-installs the signal handler + unless this is truly necessary, which fixes a bug that could make it + impossible to catch the EINTR signal robustly. (Fixed by +- Charles-François Natali; :issue:`8354`.) ++ Charles-Francois Natali; :issue:`8354`.) + + * New functions: in the :mod:`site` module, three new functions + return various site- and user-specific paths. +@@ -1466,10 +1506,10 @@ + defaults to False; if overridden to be True, + new request connections will have the TCP_NODELAY option set to + prevent buffering many small sends into a single TCP packet. +- The :attr:`~SocketServer.TCPServer.timeout` class attribute can hold ++ The :attr:`~SocketServer.BaseServer.timeout` class attribute can hold + a timeout in seconds that will be applied to the request socket; if +- no request is received within that time, :meth:`handle_timeout` +- will be called and :meth:`handle_request` will return. ++ no request is received within that time, :meth:`~SocketServer.BaseServer.handle_timeout` ++ will be called and :meth:`~SocketServer.BaseServer.handle_request` will return. + (Contributed by Kristján Valur Jónsson; :issue:`6192` and :issue:`6267`.) + + * Updated module: the :mod:`sqlite3` module has been updated to +@@ -1479,7 +1519,7 @@ + and then call :meth:`~sqlite3.Connection.load_extension` to load a particular shared library. + (Updated by Gerhard Häring.) + +-* The :mod:`ssl` module's :class:`ssl.SSLSocket` objects now support the ++* The :mod:`ssl` module's :class:`~ssl.SSLSocket` objects now support the + buffer API, which fixed a test suite failure (fix by Antoine Pitrou; + :issue:`7133`) and automatically set + OpenSSL's :c:macro:`SSL_MODE_AUTO_RETRY`, which will prevent an error +@@ -1535,7 +1575,7 @@ + on receiving an :const:`EINTR` signal. (Reported by several people; final + patch by Gregory P. Smith in :issue:`1068268`.) 
+ +-* New function: :func:`~symtable.is_declared_global` in the :mod:`symtable` module ++* New function: :func:`~symtable.Symbol.is_declared_global` in the :mod:`symtable` module + returns true for variables that are explicitly declared to be global, + false for ones that are implicitly global. + (Contributed by Jeremy Hylton.) +@@ -1716,7 +1756,7 @@ + Makefile and the :file:`pyconfig.h` file. + * :func:`~sysconfig.get_config_vars` returns a dictionary containing + all of the configuration variables. +-* :func:`~sysconfig.getpath` returns the configured path for ++* :func:`~sysconfig.get_path` returns the configured path for + a particular type of module: the standard library, + site-specific modules, platform-specific modules, etc. + * :func:`~sysconfig.is_python_build` returns true if you're running a +@@ -1778,7 +1818,7 @@ + Consult the :mod:`unittest` module documentation for more details. + (Developed in :issue:`6001`.) + +-The :func:`main` function supports some other new options: ++The :func:`~unittest.main` function supports some other new options: + + * :option:`-b` or :option:`--buffer` will buffer the standard output + and standard error streams during each test. If the test passes, +@@ -1796,7 +1836,7 @@ + being tested or the tests being run have defined a signal handler of + their own, by noticing that a signal handler was already set and + calling it. If this doesn't work for you, there's a +- :func:`removeHandler` decorator that can be used to mark tests that ++ :func:`~unittest.removeHandler` decorator that can be used to mark tests that + should have the control-C handling disabled. + + * :option:`-f` or :option:`--failfast` makes +@@ -1923,7 +1963,7 @@ + + :func:`unittest.main` now takes an optional ``exit`` argument. If + False, :func:`~unittest.main` doesn't call :func:`sys.exit`, allowing +-:func:`main` to be used from the interactive interpreter. ++:func:`~unittest.main` to be used from the interactive interpreter. + (Contributed by J. Pablo Fernández; :issue:`3379`.) + + :class:`~unittest.TestResult` has new :meth:`~unittest.TestResult.startTestRun` and +@@ -2120,7 +2160,7 @@ + :c:macro:`Py_ISSPACE`, + :c:macro:`Py_ISUPPER`, + :c:macro:`Py_ISXDIGIT`, +- and :c:macro:`Py_TOLOWER`, :c:macro:`Py_TOUPPER`. ++ :c:macro:`Py_TOLOWER`, and :c:macro:`Py_TOUPPER`. + All of these functions are analogous to the C + standard macros for classifying characters, but ignore the current + locale setting, because in +@@ -2266,11 +2306,11 @@ + (Contributed by David Cournapeau; :issue:`4365`.) + + * The :mod:`_winreg` module for accessing the registry now implements +- the :func:`CreateKeyEx` and :func:`DeleteKeyEx` functions, extended +- versions of previously-supported functions that take several extra +- arguments. The :func:`DisableReflectionKey`, +- :func:`EnableReflectionKey`, and :func:`QueryReflectionKey` were also +- tested and documented. ++ the :func:`~_winreg.CreateKeyEx` and :func:`~_winreg.DeleteKeyEx` ++ functions, extended versions of previously-supported functions that ++ take several extra arguments. The :func:`~_winreg.DisableReflectionKey`, ++ :func:`~_winreg.EnableReflectionKey`, and :func:`~_winreg.QueryReflectionKey` ++ were also tested and documented. + (Implemented by Brian Curtin: :issue:`7347`.) + + * The new :c:func:`_beginthreadex` API is used to start threads, and +@@ -2329,7 +2369,7 @@ + attributes of the resulting code objects are overwritten when the + original filename is obsolete. 
This can happen if the file has been + renamed, moved, or is accessed through different paths. (Patch by +- Žiga Seilnacht and Jean-Paul Calderone; :issue:`1180193`.) ++ Ziga Seilnacht and Jean-Paul Calderone; :issue:`1180193`.) + + * The :file:`regrtest.py` script now takes a :option:`--randseed=` + switch that takes an integer that will be used as the random seed +@@ -2387,20 +2427,20 @@ + + In the standard library: + +-* Operations with :class:`datetime` instances that resulted in a year ++* Operations with :class:`~datetime.datetime` instances that resulted in a year + falling outside the supported range didn't always raise + :exc:`OverflowError`. Such errors are now checked more carefully + and will now raise the exception. (Reported by Mark Leander, patch + by Anand B. Pillai and Alexander Belopolsky; :issue:`7150`.) + +-* When using :class:`Decimal` instances with a string's ++* When using :class:`~decimal.Decimal` instances with a string's + :meth:`format` method, the default alignment was previously + left-alignment. This has been changed to right-alignment, which might + change the output of your programs. + (Changed by Mark Dickinson; :issue:`6857`.) + + Comparisons involving a signaling NaN value (or ``sNAN``) now signal +- :const:`InvalidOperation` instead of silently returning a true or ++ :const:`~decimal.InvalidOperation` instead of silently returning a true or + false value depending on the comparison operator. Quiet NaN values + (or ``NaN``) are now hashable. (Fixed by Mark Dickinson; + :issue:`7279`.) +@@ -2411,7 +2451,7 @@ + or comment (which looks like ``). + (Patch by Neil Muller; :issue:`2746`.) + +-* The :meth:`readline` method of :class:`StringIO` objects now does ++* The :meth:`~StringIO.StringIO.readline` method of :class:`~StringIO.StringIO` objects now does + nothing when a negative length is requested, as other file-like + objects do. (:issue:`7348`). + +@@ -2470,6 +2510,54 @@ + .. ====================================================================== + + ++.. _py27-maintenance-enhancements: ++ ++New Features Added to Python 2.7 Maintenance Releases ++===================================================== ++ ++New features may be added to Python 2.7 maintenance releases when the ++situation genuinely calls for it. Any such additions must go through ++the Python Enhancement Proposal process, and make a compelling case for why ++they can't be adequately addressed by either adding the new feature solely to ++Python 3, or else by publishing it on the Python Package Index. ++ ++In addition to the specific proposals listed below, there is a general ++exemption allowing new ``-3`` warnings to be added in any Python 2.7 ++maintenance release. ++ ++ ++PEP 434: IDLE Enhancement Exception for All Branches ++---------------------------------------------------- ++ ++:pep:`434` describes a general exemption for changes made to the IDLE ++development environment shipped along with Python. This exemption makes it ++possible for the IDLE developers to provide a more consistent user ++experience across all supported versions of Python 2 and 3. ++ ++For details of any IDLE changes, refer to the NEWS file for the specific ++release. ++ ++ ++PEP 466: Network Security Enhancements for Python 2.7 ++----------------------------------------------------- ++ ++:pep:`466` describes a number of network security enhancement proposals ++that have been approved for inclusion in Python 2.7 maintenance releases, ++with the first of those changes appearing in the Python 2.7.7 release. 
++ ++:pep:`466` related features added in Python 2.7.7: ++ ++* :func:`hmac.compare_digest` was added to make a timing attack resistant ++ comparison operation broadly available to Python 2 applications ++ (backported by Alex Gaynor in :issue:`21306`) ++ ++* the version of OpenSSL linked with the prebuilt Windows installers ++ published on python.org was updated to 1.0.1g (contributed by ++ Zachary Ware in :issue:`21462`) ++ ++ ++.. ====================================================================== ++ + .. _acks27: + + Acknowledgements +diff -r c0e311e010fc Include/listobject.h +--- a/Include/listobject.h ++++ b/Include/listobject.h +@@ -46,7 +46,7 @@ + PyAPI_DATA(PyTypeObject) PySortWrapper_Type; + + #define PyList_Check(op) \ +- PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_LIST_SUBCLASS) ++ PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_LIST_SUBCLASS) + #define PyList_CheckExact(op) (Py_TYPE(op) == &PyList_Type) + + PyAPI_FUNC(PyObject *) PyList_New(Py_ssize_t size); +diff -r c0e311e010fc Lib/_collections_abc.py +--- a/Lib/_collections_abc.py ++++ b/Lib/_collections_abc.py +@@ -183,7 +183,7 @@ + methods except for __contains__, __iter__ and __len__. + + To override the comparisons (presumably for speed, as the +- semantics are fixed), all you have to do is redefine __le__ and ++ semantics are fixed), redefine __le__ and __ge__, + then the other operations will automatically follow suit. + """ + +@@ -207,12 +207,17 @@ + def __gt__(self, other): + if not isinstance(other, Set): + return NotImplemented +- return other.__lt__(self) ++ return len(self) > len(other) and self.__ge__(other) + + def __ge__(self, other): + if not isinstance(other, Set): + return NotImplemented +- return other.__le__(self) ++ if len(self) < len(other): ++ return False ++ for elem in other: ++ if elem not in self: ++ return False ++ return True + + def __eq__(self, other): + if not isinstance(other, Set): +@@ -236,6 +241,8 @@ + return NotImplemented + return self._from_iterable(value for value in other if value in self) + ++ __rand__ = __and__ ++ + def isdisjoint(self, other): + 'Return True if two sets have a null intersection.' + for value in other: +@@ -249,6 +256,8 @@ + chain = (e for s in (self, other) for e in s) + return self._from_iterable(chain) + ++ __ror__ = __or__ ++ + def __sub__(self, other): + if not isinstance(other, Set): + if not isinstance(other, Iterable): +@@ -257,6 +266,14 @@ + return self._from_iterable(value for value in self + if value not in other) + ++ def __rsub__(self, other): ++ if not isinstance(other, Set): ++ if not isinstance(other, Iterable): ++ return NotImplemented ++ other = self._from_iterable(other) ++ return self._from_iterable(value for value in other ++ if value not in self) ++ + def __xor__(self, other): + if not isinstance(other, Set): + if not isinstance(other, Iterable): +@@ -264,6 +281,8 @@ + other = self._from_iterable(other) + return (self - other) | (other - self) + ++ __rxor__ = __xor__ ++ + def _hash(self): + """Compute the hash value of a set. + +diff -r c0e311e010fc Lib/_osx_support.py +--- a/Lib/_osx_support.py ++++ b/Lib/_osx_support.py +@@ -450,8 +450,16 @@ + # case and disallow installs. + cflags = _config_vars.get(_INITPRE+'CFLAGS', + _config_vars.get('CFLAGS', '')) +- if ((macrelease + '.') >= '10.4.' 
and +- '-arch' in cflags.strip()): ++ if macrelease: ++ try: ++ macrelease = tuple(int(i) for i in macrelease.split('.')[0:2]) ++ except ValueError: ++ macrelease = (10, 0) ++ else: ++ # assume no universal support ++ macrelease = (10, 0) ++ ++ if (macrelease >= (10, 4)) and '-arch' in cflags.strip(): + # The universal build will build fat binaries, but not on + # systems before 10.4 + +diff -r c0e311e010fc Lib/_pyio.py +--- a/Lib/_pyio.py ++++ b/Lib/_pyio.py +@@ -200,38 +200,45 @@ + (appending and "a" or "") + + (updating and "+" or ""), + closefd, opener=opener) +- line_buffering = False +- if buffering == 1 or buffering < 0 and raw.isatty(): +- buffering = -1 +- line_buffering = True +- if buffering < 0: +- buffering = DEFAULT_BUFFER_SIZE +- try: +- bs = os.fstat(raw.fileno()).st_blksize +- except (OSError, AttributeError): +- pass ++ result = raw ++ try: ++ line_buffering = False ++ if buffering == 1 or buffering < 0 and raw.isatty(): ++ buffering = -1 ++ line_buffering = True ++ if buffering < 0: ++ buffering = DEFAULT_BUFFER_SIZE ++ try: ++ bs = os.fstat(raw.fileno()).st_blksize ++ except (OSError, AttributeError): ++ pass ++ else: ++ if bs > 1: ++ buffering = bs ++ if buffering < 0: ++ raise ValueError("invalid buffering size") ++ if buffering == 0: ++ if binary: ++ return result ++ raise ValueError("can't have unbuffered text I/O") ++ if updating: ++ buffer = BufferedRandom(raw, buffering) ++ elif creating or writing or appending: ++ buffer = BufferedWriter(raw, buffering) ++ elif reading: ++ buffer = BufferedReader(raw, buffering) + else: +- if bs > 1: +- buffering = bs +- if buffering < 0: +- raise ValueError("invalid buffering size") +- if buffering == 0: ++ raise ValueError("unknown mode: %r" % mode) ++ result = buffer + if binary: +- return raw +- raise ValueError("can't have unbuffered text I/O") +- if updating: +- buffer = BufferedRandom(raw, buffering) +- elif creating or writing or appending: +- buffer = BufferedWriter(raw, buffering) +- elif reading: +- buffer = BufferedReader(raw, buffering) +- else: +- raise ValueError("unknown mode: %r" % mode) +- if binary: +- return buffer +- text = TextIOWrapper(buffer, encoding, errors, newline, line_buffering) +- text.mode = mode +- return text ++ return result ++ text = TextIOWrapper(buffer, encoding, errors, newline, line_buffering) ++ result = text ++ text.mode = mode ++ return result ++ except: ++ result.close() ++ raise + + + class DocDescriptor: +diff -r c0e311e010fc Lib/argparse.py +--- a/Lib/argparse.py ++++ b/Lib/argparse.py +@@ -1198,9 +1198,13 @@ + setattr(self, name, kwargs[name]) + + def __eq__(self, other): ++ if not isinstance(other, Namespace): ++ return NotImplemented + return vars(self) == vars(other) + + def __ne__(self, other): ++ if not isinstance(other, Namespace): ++ return NotImplemented + return not (self == other) + + def __contains__(self, key): +diff -r c0e311e010fc Lib/asynchat.py +--- a/Lib/asynchat.py ++++ b/Lib/asynchat.py +@@ -49,22 +49,22 @@ + from collections import deque + + +-class async_chat (asyncore.dispatcher): ++class async_chat(asyncore.dispatcher): + """This is an abstract class. 
You must derive from this class, and add + the two methods collect_incoming_data() and found_terminator()""" + + # these are overridable defaults + +- ac_in_buffer_size = 65536 +- ac_out_buffer_size = 65536 ++ ac_in_buffer_size = 65536 ++ ac_out_buffer_size = 65536 + + # we don't want to enable the use of encoding by default, because that is a + # sign of an application bug that we don't want to pass silently + +- use_encoding = 0 +- encoding = 'latin-1' ++ use_encoding = 0 ++ encoding = 'latin-1' + +- def __init__ (self, sock=None, map=None): ++ def __init__(self, sock=None, map=None): + # for string terminator matching + self.ac_in_buffer = b'' + +@@ -76,7 +76,7 @@ + # we toss the use of the "simple producer" and replace it with + # a pure deque, which the original fifo was a wrapping of + self.producer_fifo = deque() +- asyncore.dispatcher.__init__ (self, sock, map) ++ asyncore.dispatcher.__init__(self, sock, map) + + def collect_incoming_data(self, data): + raise NotImplementedError("must be implemented in subclass") +@@ -92,13 +92,18 @@ + def found_terminator(self): + raise NotImplementedError("must be implemented in subclass") + +- def set_terminator (self, term): +- "Set the input delimiter. Can be a fixed string of any length, an integer, or None" ++ def set_terminator(self, term): ++ """Set the input delimiter. ++ ++ Can be a fixed string of any length, an integer, or None. ++ """ + if isinstance(term, str) and self.use_encoding: + term = bytes(term, self.encoding) ++ elif isinstance(term, int) and term < 0: ++ raise ValueError('the number of received bytes must be positive') + self.terminator = term + +- def get_terminator (self): ++ def get_terminator(self): + return self.terminator + + # grab some more data from the socket, +@@ -106,10 +111,12 @@ + # check for the terminator, + # if found, transition to the next state. + +- def handle_read (self): ++ def handle_read(self): + + try: +- data = self.recv (self.ac_in_buffer_size) ++ data = self.recv(self.ac_in_buffer_size) ++ except BlockingIOError: ++ return + except OSError as why: + self.handle_error() + return +@@ -128,17 +135,17 @@ + terminator = self.get_terminator() + if not terminator: + # no terminator, collect it all +- self.collect_incoming_data (self.ac_in_buffer) ++ self.collect_incoming_data(self.ac_in_buffer) + self.ac_in_buffer = b'' + elif isinstance(terminator, int): + # numeric terminator + n = terminator + if lb < n: +- self.collect_incoming_data (self.ac_in_buffer) ++ self.collect_incoming_data(self.ac_in_buffer) + self.ac_in_buffer = b'' + self.terminator = self.terminator - lb + else: +- self.collect_incoming_data (self.ac_in_buffer[:n]) ++ self.collect_incoming_data(self.ac_in_buffer[:n]) + self.ac_in_buffer = self.ac_in_buffer[n:] + self.terminator = 0 + self.found_terminator() +@@ -155,32 +162,37 @@ + if index != -1: + # we found the terminator + if index > 0: +- # don't bother reporting the empty string (source of subtle bugs) +- self.collect_incoming_data (self.ac_in_buffer[:index]) ++ # don't bother reporting the empty string ++ # (source of subtle bugs) ++ self.collect_incoming_data(self.ac_in_buffer[:index]) + self.ac_in_buffer = self.ac_in_buffer[index+terminator_len:] +- # This does the Right Thing if the terminator is changed here. ++ # This does the Right Thing if the terminator ++ # is changed here. 
+ self.found_terminator() + else: + # check for a prefix of the terminator +- index = find_prefix_at_end (self.ac_in_buffer, terminator) ++ index = find_prefix_at_end(self.ac_in_buffer, terminator) + if index: + if index != lb: + # we found a prefix, collect up to the prefix +- self.collect_incoming_data (self.ac_in_buffer[:-index]) ++ self.collect_incoming_data(self.ac_in_buffer[:-index]) + self.ac_in_buffer = self.ac_in_buffer[-index:] + break + else: + # no prefix, collect it all +- self.collect_incoming_data (self.ac_in_buffer) ++ self.collect_incoming_data(self.ac_in_buffer) + self.ac_in_buffer = b'' + +- def handle_write (self): ++ def handle_write(self): + self.initiate_send() + +- def handle_close (self): ++ def handle_close(self): + self.close() + +- def push (self, data): ++ def push(self, data): ++ if not isinstance(data, (bytes, bytearray, memoryview)): ++ raise TypeError('data argument must be byte-ish (%r)', ++ type(data)) + sabs = self.ac_out_buffer_size + if len(data) > sabs: + for i in range(0, len(data), sabs): +@@ -189,11 +201,11 @@ + self.producer_fifo.append(data) + self.initiate_send() + +- def push_with_producer (self, producer): ++ def push_with_producer(self, producer): + self.producer_fifo.append(producer) + self.initiate_send() + +- def readable (self): ++ def readable(self): + "predicate for inclusion in the readable for select()" + # cannot use the old predicate, it violates the claim of the + # set_terminator method. +@@ -201,11 +213,11 @@ + # return (len(self.ac_in_buffer) <= self.ac_in_buffer_size) + return 1 + +- def writable (self): ++ def writable(self): + "predicate for inclusion in the writable for select()" + return self.producer_fifo or (not self.connected) + +- def close_when_done (self): ++ def close_when_done(self): + "automatically close this channel once the outgoing queue is empty" + self.producer_fifo.append(None) + +@@ -216,10 +228,8 @@ + if not first: + del self.producer_fifo[0] + if first is None: +- ## print("first is None") + self.handle_close() + return +- ## print("first is not None") + + # handle classic producer behavior + obs = self.ac_out_buffer_size +@@ -251,20 +261,21 @@ + # we tried to send some actual data + return + +- def discard_buffers (self): ++ def discard_buffers(self): + # Emergencies only! + self.ac_in_buffer = b'' + del self.incoming[:] + self.producer_fifo.clear() + ++ + class simple_producer: + +- def __init__ (self, data, buffer_size=512): ++ def __init__(self, data, buffer_size=512): + self.data = data + self.buffer_size = buffer_size + +- def more (self): +- if len (self.data) > self.buffer_size: ++ def more(self): ++ if len(self.data) > self.buffer_size: + result = self.data[:self.buffer_size] + self.data = self.data[self.buffer_size:] + return result +@@ -273,38 +284,40 @@ + self.data = b'' + return result + ++ + class fifo: +- def __init__ (self, list=None): ++ def __init__(self, list=None): + if not list: + self.list = deque() + else: + self.list = deque(list) + +- def __len__ (self): ++ def __len__(self): + return len(self.list) + +- def is_empty (self): ++ def is_empty(self): + return not self.list + +- def first (self): ++ def first(self): + return self.list[0] + +- def push (self, data): ++ def push(self, data): + self.list.append(data) + +- def pop (self): ++ def pop(self): + if self.list: + return (1, self.list.popleft()) + else: + return (0, None) + ++ + # Given 'haystack', see if any prefix of 'needle' is at its end. This + # assumes an exact match has already been checked. 
Return the number of + # characters matched. + # for example: +-# f_p_a_e ("qwerty\r", "\r\n") => 1 +-# f_p_a_e ("qwertydkjf", "\r\n") => 0 +-# f_p_a_e ("qwerty\r\n", "\r\n") => ++# f_p_a_e("qwerty\r", "\r\n") => 1 ++# f_p_a_e("qwertydkjf", "\r\n") => 0 ++# f_p_a_e("qwerty\r\n", "\r\n") => + + # this could maybe be made faster with a computed regex? + # [answer: no; circa Python-2.0, Jan 2001] +@@ -313,7 +326,7 @@ + # re: 12820/s + # regex: 14035/s + +-def find_prefix_at_end (haystack, needle): ++def find_prefix_at_end(haystack, needle): + l = len(needle) - 1 + while l and not haystack.endswith(needle[:l]): + l -= 1 +diff -r c0e311e010fc Lib/asyncio/__init__.py +--- a/Lib/asyncio/__init__.py ++++ b/Lib/asyncio/__init__.py +@@ -18,6 +18,7 @@ + import _overlapped # Will also be exported. + + # This relies on each of the submodules having an __all__ variable. ++from .coroutines import * + from .events import * + from .futures import * + from .locks import * +@@ -28,13 +29,8 @@ + from .tasks import * + from .transports import * + +-if sys.platform == 'win32': # pragma: no cover +- from .windows_events import * +-else: +- from .unix_events import * # pragma: no cover +- +- +-__all__ = (events.__all__ + ++__all__ = (coroutines.__all__ + ++ events.__all__ + + futures.__all__ + + locks.__all__ + + protocols.__all__ + +@@ -43,3 +39,10 @@ + subprocess.__all__ + + tasks.__all__ + + transports.__all__) ++ ++if sys.platform == 'win32': # pragma: no cover ++ from .windows_events import * ++ __all__ += windows_events.__all__ ++else: ++ from .unix_events import * # pragma: no cover ++ __all__ += unix_events.__all__ +diff -r c0e311e010fc Lib/asyncio/base_events.py +--- a/Lib/asyncio/base_events.py ++++ b/Lib/asyncio/base_events.py +@@ -1,7 +1,7 @@ + """Base implementation of event loop. + + The event loop can be broken up into a multiplexer (the part +-responsible for notifying us of IO events) and the event loop proper, ++responsible for notifying us of I/O events) and the event loop proper, + which wraps a multiplexer with functionality for scheduling callbacks, + immediately or at a given time in the future. + +@@ -17,16 +17,20 @@ + import collections + import concurrent.futures + import heapq ++import inspect + import logging ++import os + import socket + import subprocess + import time +-import os ++import traceback + import sys + ++from . import coroutines + from . import events + from . import futures + from . import tasks ++from .coroutines import coroutine + from .log import logger + + +@@ -37,6 +41,24 @@ + _MAX_WORKERS = 5 + + ++def _format_handle(handle): ++ cb = handle._callback ++ if inspect.ismethod(cb) and isinstance(cb.__self__, tasks.Task): ++ # format the task ++ return repr(cb.__self__) ++ else: ++ return str(handle) ++ ++ ++def _format_pipe(fd): ++ if fd == subprocess.PIPE: ++ return '' ++ elif fd == subprocess.STDOUT: ++ return '' ++ else: ++ return repr(fd) ++ ++ + class _StopError(BaseException): + """Raised to stop the event loop.""" + +@@ -57,7 +79,7 @@ + type_mask |= socket.SOCK_NONBLOCK + if hasattr(socket, 'SOCK_CLOEXEC'): + type_mask |= socket.SOCK_CLOEXEC +- # Use getaddrinfo(AI_NUMERICHOST) to ensure that the address is ++ # Use getaddrinfo(flags=AI_NUMERICHOST) to ensure that the address is + # already resolved. 
+ try: + socket.getaddrinfo(host, port, +@@ -76,49 +98,54 @@ + class Server(events.AbstractServer): + + def __init__(self, loop, sockets): +- self.loop = loop ++ self._loop = loop + self.sockets = sockets +- self.active_count = 0 +- self.waiters = [] ++ self._active_count = 0 ++ self._waiters = [] + +- def attach(self, transport): ++ def __repr__(self): ++ return '<%s sockets=%r>' % (self.__class__.__name__, self.sockets) ++ ++ def _attach(self): + assert self.sockets is not None +- self.active_count += 1 ++ self._active_count += 1 + +- def detach(self, transport): +- assert self.active_count > 0 +- self.active_count -= 1 +- if self.active_count == 0 and self.sockets is None: ++ def _detach(self): ++ assert self._active_count > 0 ++ self._active_count -= 1 ++ if self._active_count == 0 and self.sockets is None: + self._wakeup() + + def close(self): + sockets = self.sockets +- if sockets is not None: +- self.sockets = None +- for sock in sockets: +- self.loop._stop_serving(sock) +- if self.active_count == 0: +- self._wakeup() ++ if sockets is None: ++ return ++ self.sockets = None ++ for sock in sockets: ++ self._loop._stop_serving(sock) ++ if self._active_count == 0: ++ self._wakeup() + + def _wakeup(self): +- waiters = self.waiters +- self.waiters = None ++ waiters = self._waiters ++ self._waiters = None + for waiter in waiters: + if not waiter.done(): + waiter.set_result(waiter) + +- @tasks.coroutine ++ @coroutine + def wait_closed(self): +- if self.sockets is None or self.waiters is None: ++ if self.sockets is None or self._waiters is None: + return +- waiter = futures.Future(loop=self.loop) +- self.waiters.append(waiter) ++ waiter = futures.Future(loop=self._loop) ++ self._waiters.append(waiter) + yield from waiter + + + class BaseEventLoop(events.AbstractEventLoop): + + def __init__(self): ++ self._closed = False + self._ready = collections.deque() + self._scheduled = [] + self._default_executor = None +@@ -126,7 +153,26 @@ + self._running = False + self._clock_resolution = time.get_clock_info('monotonic').resolution + self._exception_handler = None +- self._debug = False ++ self._debug = (not sys.flags.ignore_environment ++ and bool(os.environ.get('PYTHONASYNCIODEBUG'))) ++ # In debug mode, if the execution of a callback or a step of a task ++ # exceed this duration in seconds, the slow callback/task is logged. ++ self.slow_callback_duration = 0.1 ++ ++ def __repr__(self): ++ return ('<%s running=%s closed=%s debug=%s>' ++ % (self.__class__.__name__, self.is_running(), ++ self.is_closed(), self.get_debug())) ++ ++ def create_task(self, coro): ++ """Schedule a coroutine object. ++ ++ Return a task object. 
++ """ ++ task = tasks.Task(coro, loop=self) ++ if task._source_traceback: ++ del task._source_traceback[-1] ++ return task + + def _make_socket_transport(self, sock, protocol, waiter=None, *, + extra=None, server=None): +@@ -140,7 +186,7 @@ + raise NotImplementedError + + def _make_datagram_transport(self, sock, protocol, +- address=None, extra=None): ++ address=None, waiter=None, extra=None): + """Create datagram transport.""" + raise NotImplementedError + +@@ -154,27 +200,33 @@ + """Create write pipe transport.""" + raise NotImplementedError + +- @tasks.coroutine ++ @coroutine + def _make_subprocess_transport(self, protocol, args, shell, + stdin, stdout, stderr, bufsize, + extra=None, **kwargs): + """Create subprocess transport.""" + raise NotImplementedError + +- def _read_from_self(self): +- """XXX""" +- raise NotImplementedError ++ def _write_to_self(self): ++ """Write a byte to self-pipe, to wake up the event loop. + +- def _write_to_self(self): +- """XXX""" ++ This may be called from a different thread. ++ ++ The subclass is responsible for implementing the self-pipe. ++ """ + raise NotImplementedError + + def _process_events(self, event_list): + """Process selector events.""" + raise NotImplementedError + ++ def _check_closed(self): ++ if self._closed: ++ raise RuntimeError('Event loop is closed') ++ + def run_forever(self): + """Run until stop() is called.""" ++ self._check_closed() + if self._running: + raise RuntimeError('Event loop is running.') + self._running = True +@@ -192,13 +244,21 @@ + + If the argument is a coroutine, it is wrapped in a Task. + +- XXX TBD: It would be disastrous to call run_until_complete() ++ WARNING: It would be disastrous to call run_until_complete() + with the same coroutine twice -- it would wrap it in two + different Tasks and that can't be good. + + Return the Future's result, or raise its exception. + """ ++ self._check_closed() ++ ++ new_task = not isinstance(future, futures.Future) + future = tasks.async(future, loop=self) ++ if new_task: ++ # An exception is raised if the future didn't complete, so there ++ # is no need to log the "destroy pending task" message ++ future._log_destroy_pending = False ++ + future.add_done_callback(_raise_stop_error) + self.run_forever() + future.remove_done_callback(_raise_stop_error) +@@ -210,9 +270,9 @@ + def stop(self): + """Stop running the event loop. + +- Every callback scheduled before stop() is called will run. +- Callback scheduled after stop() is called won't. However, +- those callbacks will run if run() is called again later. ++ Every callback scheduled before stop() is called will run. Callbacks ++ scheduled after stop() is called will not run. However, those callbacks ++ will run if run_forever is called again later. + """ + self.call_soon(_raise_stop_error) + +@@ -221,7 +281,16 @@ + + This clears the queues and shuts down the executor, + but does not wait for the executor to finish. ++ ++ The event loop must not be running. 
+ """ ++ if self._running: ++ raise RuntimeError("Cannot close a running event loop") ++ if self._closed: ++ return ++ if self._debug: ++ logger.debug("Close %r", self) ++ self._closed = True + self._ready.clear() + self._scheduled.clear() + executor = self._default_executor +@@ -229,12 +298,21 @@ + self._default_executor = None + executor.shutdown(wait=False) + ++ def is_closed(self): ++ """Returns True if the event loop was closed.""" ++ return self._closed ++ + def is_running(self): +- """Returns running status of event loop.""" ++ """Returns True if the event loop is running.""" + return self._running + + def time(self): +- """Return the time according to the event loop's clock.""" ++ """Return the time according to the event loop's clock. ++ ++ This is a float expressed in seconds since an epoch, but the ++ epoch, precision, accuracy and drift are unspecified and may ++ differ per event loop. ++ """ + return time.monotonic() + + def call_later(self, delay, callback, *args): +@@ -244,7 +322,7 @@ + can be used to cancel the call. + + The delay can be an int or float, expressed in seconds. It is +- always a relative time. ++ always relative to the current time. + + Each callback will be called exactly once. If two callbacks + are scheduled for exactly the same time, it undefined which +@@ -253,62 +331,81 @@ + Any positional arguments after the callback will be passed to + the callback when it is called. + """ +- return self.call_at(self.time() + delay, callback, *args) ++ timer = self.call_at(self.time() + delay, callback, *args) ++ if timer._source_traceback: ++ del timer._source_traceback[-1] ++ return timer + + def call_at(self, when, callback, *args): +- """Like call_later(), but uses an absolute time.""" +- if tasks.iscoroutinefunction(callback): ++ """Like call_later(), but uses an absolute time. ++ ++ Absolute time corresponds to the event loop's time() method. ++ """ ++ if coroutines.iscoroutinefunction(callback): + raise TypeError("coroutines cannot be used with call_at()") + if self._debug: + self._assert_is_current_event_loop() + timer = events.TimerHandle(when, callback, args, self) ++ if timer._source_traceback: ++ del timer._source_traceback[-1] + heapq.heappush(self._scheduled, timer) + return timer + + def call_soon(self, callback, *args): + """Arrange for a callback to be called as soon as possible. + +- This operates as a FIFO queue, callbacks are called in the ++ This operates as a FIFO queue: callbacks are called in the + order in which they are registered. Each callback will be + called exactly once. + + Any positional arguments after the callback will be passed to + the callback when it is called. + """ +- return self._call_soon(callback, args, check_loop=True) ++ handle = self._call_soon(callback, args, check_loop=True) ++ if handle._source_traceback: ++ del handle._source_traceback[-1] ++ return handle + + def _call_soon(self, callback, args, check_loop): +- if tasks.iscoroutinefunction(callback): ++ if coroutines.iscoroutinefunction(callback): + raise TypeError("coroutines cannot be used with call_soon()") + if self._debug and check_loop: + self._assert_is_current_event_loop() + handle = events.Handle(callback, args, self) ++ if handle._source_traceback: ++ del handle._source_traceback[-1] + self._ready.append(handle) + return handle + + def _assert_is_current_event_loop(self): + """Asserts that this event loop is the current event loop. 
+ +- Non-threadsafe methods of this class make this assumption and will ++ Non-thread-safe methods of this class make this assumption and will + likely behave incorrectly when the assumption is violated. + +- Should only be called when (self._debug == True). The caller is ++ Should only be called when (self._debug == True). The caller is + responsible for checking this condition for performance reasons. + """ +- if events.get_event_loop() is not self: ++ try: ++ current = events.get_event_loop() ++ except AssertionError: ++ return ++ if current is not self: + raise RuntimeError( +- "non-threadsafe operation invoked on an event loop other " ++ "Non-thread-safe operation invoked on an event loop other " + "than the current one") + + def call_soon_threadsafe(self, callback, *args): +- """XXX""" ++ """Like call_soon(), but thread-safe.""" + handle = self._call_soon(callback, args, check_loop=False) ++ if handle._source_traceback: ++ del handle._source_traceback[-1] + self._write_to_self() + return handle + + def run_in_executor(self, executor, callback, *args): +- if tasks.iscoroutinefunction(callback): +- raise TypeError("coroutines cannot be used with run_in_executor()") ++ if coroutines.iscoroutinefunction(callback): ++ raise TypeError("Coroutines cannot be used with run_in_executor()") + if isinstance(callback, events.Handle): + assert not args + assert not isinstance(callback, events.TimerHandle) +@@ -327,19 +424,58 @@ + def set_default_executor(self, executor): + self._default_executor = executor + ++ def _getaddrinfo_debug(self, host, port, family, type, proto, flags): ++ msg = ["%s:%r" % (host, port)] ++ if family: ++ msg.append('family=%r' % family) ++ if type: ++ msg.append('type=%r' % type) ++ if proto: ++ msg.append('proto=%r' % proto) ++ if flags: ++ msg.append('flags=%r' % flags) ++ msg = ', '.join(msg) ++ logger.debug('Get address info %s', msg) ++ ++ t0 = self.time() ++ addrinfo = socket.getaddrinfo(host, port, family, type, proto, flags) ++ dt = self.time() - t0 ++ ++ msg = ('Getting address info %s took %.3f ms: %r' ++ % (msg, dt * 1e3, addrinfo)) ++ if dt >= self.slow_callback_duration: ++ logger.info(msg) ++ else: ++ logger.debug(msg) ++ return addrinfo ++ + def getaddrinfo(self, host, port, *, + family=0, type=0, proto=0, flags=0): +- return self.run_in_executor(None, socket.getaddrinfo, +- host, port, family, type, proto, flags) ++ if self._debug: ++ return self.run_in_executor(None, self._getaddrinfo_debug, ++ host, port, family, type, proto, flags) ++ else: ++ return self.run_in_executor(None, socket.getaddrinfo, ++ host, port, family, type, proto, flags) + + def getnameinfo(self, sockaddr, flags=0): + return self.run_in_executor(None, socket.getnameinfo, sockaddr, flags) + +- @tasks.coroutine ++ @coroutine + def create_connection(self, protocol_factory, host=None, port=None, *, + ssl=None, family=0, proto=0, flags=0, sock=None, + local_addr=None, server_hostname=None): +- """XXX""" ++ """Connect to a TCP server. ++ ++ Create a streaming transport connection to a given Internet host and ++ port: socket family AF_INET or socket.AF_INET6 depending on host (or ++ family if specified), socket type SOCK_STREAM. protocol_factory must be ++ a callable returning a protocol instance. ++ ++ This method is a coroutine which will try to establish the connection ++ in the background. When successful, the coroutine returns a ++ (transport, protocol) pair. 
++ """ + if server_hostname is not None and not ssl: + raise ValueError('server_hostname is only meaningful with ssl') + +@@ -407,11 +543,17 @@ + sock.close() + sock = None + continue ++ if self._debug: ++ logger.debug("connect %r to %r", sock, address) + yield from self.sock_connect(sock, address) + except OSError as exc: + if sock is not None: + sock.close() + exceptions.append(exc) ++ except: ++ if sock is not None: ++ sock.close() ++ raise + else: + break + else: +@@ -435,9 +577,12 @@ + + transport, protocol = yield from self._create_connection_transport( + sock, protocol_factory, ssl, server_hostname) ++ if self._debug: ++ logger.debug("%r connected to %s:%r: (%r, %r)", ++ sock, host, port, transport, protocol) + return transport, protocol + +- @tasks.coroutine ++ @coroutine + def _create_connection_transport(self, sock, protocol_factory, ssl, + server_hostname): + protocol = protocol_factory() +@@ -453,7 +598,7 @@ + yield from waiter + return transport, protocol + +- @tasks.coroutine ++ @coroutine + def create_datagram_endpoint(self, protocol_factory, + local_addr=None, remote_addr=None, *, + family=0, proto=0, flags=0): +@@ -463,7 +608,7 @@ + raise ValueError('unexpected address family') + addr_pairs_info = (((family, proto), (None, None)),) + else: +- # join addresss by (family, protocol) ++ # join address by (family, protocol) + addr_infos = collections.OrderedDict() + for idx, addr in ((0, local_addr), (1, remote_addr)): + if addr is not None: +@@ -512,16 +657,32 @@ + if sock is not None: + sock.close() + exceptions.append(exc) ++ except: ++ if sock is not None: ++ sock.close() ++ raise + else: + break + else: + raise exceptions[0] + + protocol = protocol_factory() +- transport = self._make_datagram_transport(sock, protocol, r_addr) ++ waiter = futures.Future(loop=self) ++ transport = self._make_datagram_transport(sock, protocol, r_addr, ++ waiter) ++ if self._debug: ++ if local_addr: ++ logger.info("Datagram endpoint local_addr=%r remote_addr=%r " ++ "created: (%r, %r)", ++ local_addr, remote_addr, transport, protocol) ++ else: ++ logger.debug("Datagram endpoint remote_addr=%r created: " ++ "(%r, %r)", ++ remote_addr, transport, protocol) ++ yield from waiter + return transport, protocol + +- @tasks.coroutine ++ @coroutine + def create_server(self, protocol_factory, host=None, port=None, + *, + family=socket.AF_UNSPEC, +@@ -530,7 +691,12 @@ + backlog=100, + ssl=None, + reuse_address=None): +- """XXX""" ++ """Create a TCP server bound to host and port. ++ ++ Return a Server object which can be used to stop the service. ++ ++ This method is a coroutine. 
++ """ + if isinstance(ssl, bool): + raise TypeError('ssl argument must be an SSLContext or None') + if host is not None or port is not None: +@@ -584,8 +750,7 @@ + sock.close() + else: + if sock is None: +- raise ValueError( +- 'host and port was not specified and no sock specified') ++ raise ValueError('Neither host/port nor sock were specified') + sockets = [sock] + + server = Server(self, sockets) +@@ -593,25 +758,46 @@ + sock.listen(backlog) + sock.setblocking(False) + self._start_serving(protocol_factory, sock, ssl, server) ++ if self._debug: ++ logger.info("%r is serving", server) + return server + +- @tasks.coroutine ++ @coroutine + def connect_read_pipe(self, protocol_factory, pipe): + protocol = protocol_factory() + waiter = futures.Future(loop=self) + transport = self._make_read_pipe_transport(pipe, protocol, waiter) + yield from waiter ++ if self._debug: ++ logger.debug('Read pipe %r connected: (%r, %r)', ++ pipe.fileno(), transport, protocol) + return transport, protocol + +- @tasks.coroutine ++ @coroutine + def connect_write_pipe(self, protocol_factory, pipe): + protocol = protocol_factory() + waiter = futures.Future(loop=self) + transport = self._make_write_pipe_transport(pipe, protocol, waiter) + yield from waiter ++ if self._debug: ++ logger.debug('Write pipe %r connected: (%r, %r)', ++ pipe.fileno(), transport, protocol) + return transport, protocol + +- @tasks.coroutine ++ def _log_subprocess(self, msg, stdin, stdout, stderr): ++ info = [msg] ++ if stdin is not None: ++ info.append('stdin=%s' % _format_pipe(stdin)) ++ if stdout is not None and stderr == subprocess.STDOUT: ++ info.append('stdout=stderr=%s' % _format_pipe(stdout)) ++ else: ++ if stdout is not None: ++ info.append('stdout=%s' % _format_pipe(stdout)) ++ if stderr is not None: ++ info.append('stderr=%s' % _format_pipe(stderr)) ++ logger.debug(' '.join(info)) ++ ++ @coroutine + def subprocess_shell(self, protocol_factory, cmd, *, stdin=subprocess.PIPE, + stdout=subprocess.PIPE, stderr=subprocess.PIPE, + universal_newlines=False, shell=True, bufsize=0, +@@ -625,11 +811,18 @@ + if bufsize != 0: + raise ValueError("bufsize must be 0") + protocol = protocol_factory() ++ if self._debug: ++ # don't log parameters: they may contain sensitive information ++ # (password) and may be too long ++ debug_log = 'run shell command %r' % cmd ++ self._log_subprocess(debug_log, stdin, stdout, stderr) + transport = yield from self._make_subprocess_transport( + protocol, cmd, True, stdin, stdout, stderr, bufsize, **kwargs) ++ if self._debug: ++ logger.info('%s: %r' % (debug_log, transport)) + return transport, protocol + +- @tasks.coroutine ++ @coroutine + def subprocess_exec(self, protocol_factory, program, *args, + stdin=subprocess.PIPE, stdout=subprocess.PIPE, + stderr=subprocess.PIPE, universal_newlines=False, +@@ -647,9 +840,16 @@ + "a bytes or text string, not %s" + % type(arg).__name__) + protocol = protocol_factory() ++ if self._debug: ++ # don't log parameters: they may contain sensitive information ++ # (password) and may be too long ++ debug_log = 'execute program %r' % program ++ self._log_subprocess(debug_log, stdin, stdout, stderr) + transport = yield from self._make_subprocess_transport( + protocol, popen_args, False, stdin, stdout, stderr, + bufsize, **kwargs) ++ if self._debug: ++ logger.info('%s: %r' % (debug_log, transport)) + return transport, protocol + + def set_exception_handler(self, handler): +@@ -659,7 +859,7 @@ + be set. 
+ + If handler is a callable object, it should have a +- matching signature to '(loop, context)', where 'loop' ++ signature matching '(loop, context)', where 'loop' + will be a reference to the active event loop, 'context' + will be a dict object (see `call_exception_handler()` + documentation for details about context). +@@ -676,7 +876,7 @@ + handler is set, and can be called by a custom exception + handler that wants to defer to the default behavior. + +- context parameter has the same meaning as in ++ The context parameter has the same meaning as in + `call_exception_handler()`. + """ + message = context.get('message') +@@ -693,15 +893,22 @@ + for key in sorted(context): + if key in {'message', 'exception'}: + continue +- log_lines.append('{}: {!r}'.format(key, context[key])) ++ value = context[key] ++ if key == 'source_traceback': ++ tb = ''.join(traceback.format_list(value)) ++ value = 'Object created at (most recent call last):\n' ++ value += tb.rstrip() ++ else: ++ value = repr(value) ++ log_lines.append('{}: {}'.format(key, value)) + + logger.error('\n'.join(log_lines), exc_info=exc_info) + + def call_exception_handler(self, context): +- """Call the current event loop exception handler. ++ """Call the current event loop's exception handler. + +- context is a dict object containing the following keys +- (new keys maybe introduced later): ++ The context argument is a dict containing the following keys: ++ + - 'message': Error message; + - 'exception' (optional): Exception object; + - 'future' (optional): Future instance; +@@ -710,8 +917,10 @@ + - 'transport' (optional): Transport instance; + - 'socket' (optional): Socket instance. + +- Note: this method should not be overloaded in subclassed +- event loops. For any custom exception handling, use ++ New keys maybe introduced in the future. ++ ++ Note: do not overload this method in an event loop subclass. ++ For custom exception handling, use the + `set_exception_handler()` method. + """ + if self._exception_handler is None: +@@ -736,7 +945,7 @@ + 'context': context, + }) + except Exception: +- # Guard 'default_exception_handler' in case it's ++ # Guard 'default_exception_handler' in case it is + # overloaded. + logger.error('Exception in default exception handler ' + 'while handling an unexpected error ' +@@ -744,7 +953,7 @@ + exc_info=True) + + def _add_callback(self, handle): +- """Add a Handle to ready or scheduled.""" ++ """Add a Handle to _scheduled (TimerHandle) or _ready.""" + assert isinstance(handle, events.Handle), 'A Handle is required here' + if handle._cancelled: + return +@@ -777,20 +986,26 @@ + when = self._scheduled[0]._when + timeout = max(0, when - self.time()) + +- # TODO: Instrumentation only in debug mode? 
+- if logger.isEnabledFor(logging.INFO): ++ if self._debug and timeout != 0: + t0 = self.time() + event_list = self._selector.select(timeout) +- t1 = self.time() +- if t1-t0 >= 1: ++ dt = self.time() - t0 ++ if dt >= 1.0: + level = logging.INFO + else: + level = logging.DEBUG +- if timeout is not None: +- logger.log(level, 'poll %.3f took %.3f seconds', +- timeout, t1-t0) +- else: +- logger.log(level, 'poll took %.3f seconds', t1-t0) ++ nevent = len(event_list) ++ if timeout is None: ++ logger.log(level, 'poll took %.3f ms: %s events', ++ dt * 1e3, nevent) ++ elif nevent: ++ logger.log(level, ++ 'poll %.3f ms took %.3f ms: %s events', ++ timeout * 1e3, dt * 1e3, nevent) ++ elif dt >= 1.0: ++ logger.log(level, ++ 'poll %.3f ms took %.3f ms: timeout', ++ timeout * 1e3, dt * 1e3) + else: + event_list = self._selector.select(timeout) + self._process_events(event_list) +@@ -809,11 +1024,20 @@ + # Note: We run all currently scheduled callbacks, but not any + # callbacks scheduled by callbacks run this time around -- + # they will be run the next time (after another I/O poll). +- # Use an idiom that is threadsafe without using locks. ++ # Use an idiom that is thread-safe without using locks. + ntodo = len(self._ready) + for i in range(ntodo): + handle = self._ready.popleft() +- if not handle._cancelled: ++ if handle._cancelled: ++ continue ++ if self._debug: ++ t0 = self.time() ++ handle._run() ++ dt = self.time() - t0 ++ if dt >= self.slow_callback_duration: ++ logger.warning('Executing %s took %.3f seconds', ++ _format_handle(handle), dt) ++ else: + handle._run() + handle = None # Needed to break cycles when an exception occurs. + +diff -r c0e311e010fc Lib/asyncio/base_subprocess.py +--- a/Lib/asyncio/base_subprocess.py ++++ b/Lib/asyncio/base_subprocess.py +@@ -2,8 +2,9 @@ + import subprocess + + from . import protocols +-from . import tasks + from . 
import transports ++from .coroutines import coroutine ++from .log import logger + + + class BaseSubprocessTransport(transports.SubprocessTransport): +@@ -14,6 +15,7 @@ + super().__init__(extra) + self._protocol = protocol + self._loop = loop ++ self._pid = None + + self._pipes = {} + if stdin == subprocess.PIPE: +@@ -27,7 +29,36 @@ + self._returncode = None + self._start(args=args, shell=shell, stdin=stdin, stdout=stdout, + stderr=stderr, bufsize=bufsize, **kwargs) ++ self._pid = self._proc.pid + self._extra['subprocess'] = self._proc ++ if self._loop.get_debug(): ++ if isinstance(args, (bytes, str)): ++ program = args ++ else: ++ program = args[0] ++ logger.debug('process %r created: pid %s', ++ program, self._pid) ++ ++ def __repr__(self): ++ info = [self.__class__.__name__, 'pid=%s' % self._pid] ++ if self._returncode is not None: ++ info.append('returncode=%s' % self._returncode) ++ ++ stdin = self._pipes.get(0) ++ if stdin is not None: ++ info.append('stdin=%s' % stdin.pipe) ++ ++ stdout = self._pipes.get(1) ++ stderr = self._pipes.get(2) ++ if stdout is not None and stderr is stdout: ++ info.append('stdout=stderr=%s' % stdout.pipe) ++ else: ++ if stdout is not None: ++ info.append('stdout=%s' % stdout.pipe) ++ if stderr is not None: ++ info.append('stderr=%s' % stderr.pipe) ++ ++ return '<%s>' % ' '.join(info) + + def _start(self, args, shell, stdin, stdout, stderr, bufsize, **kwargs): + raise NotImplementedError +@@ -45,7 +76,7 @@ + self.terminate() + + def get_pid(self): +- return self._proc.pid ++ return self._pid + + def get_returncode(self): + return self._returncode +@@ -65,7 +96,7 @@ + def kill(self): + self._proc.kill() + +- @tasks.coroutine ++ @coroutine + def _post_init(self): + proc = self._proc + loop = self._loop +@@ -108,6 +139,9 @@ + def _process_exited(self, returncode): + assert returncode is not None, returncode + assert self._returncode is None, self._returncode ++ if self._loop.get_debug(): ++ logger.info('%r exited with return code %r', ++ self, returncode) + self._returncode = returncode + self._call(self._protocol.process_exited) + self._try_finish() +@@ -141,6 +175,10 @@ + def connection_made(self, transport): + self.pipe = transport + ++ def __repr__(self): ++ return ('<%s fd=%s pipe=%r>' ++ % (self.__class__.__name__, self.fd, self.pipe)) ++ + def connection_lost(self, exc): + self.disconnected = True + self.proc._pipe_connection_lost(self.fd, exc) +diff -r c0e311e010fc Lib/asyncio/coroutines.py +--- /dev/null ++++ b/Lib/asyncio/coroutines.py +@@ -0,0 +1,195 @@ ++__all__ = ['coroutine', ++ 'iscoroutinefunction', 'iscoroutine'] ++ ++import functools ++import inspect ++import opcode ++import os ++import sys ++import traceback ++import types ++ ++from . import events ++from . import futures ++from .log import logger ++ ++ ++# Opcode of "yield from" instruction ++_YIELD_FROM = opcode.opmap['YIELD_FROM'] ++ ++# If you set _DEBUG to true, @coroutine will wrap the resulting ++# generator objects in a CoroWrapper instance (defined below). That ++# instance will log a message when the generator is never iterated ++# over, which may happen when you forget to use "yield from" with a ++# coroutine call. Note that the value of the _DEBUG flag is taken ++# when the decorator is used, so to be of any use it must be set ++# before you define your coroutines. A downside of using this feature ++# is that tracebacks show entries for the CoroWrapper.__next__ method ++# when _DEBUG is true. 
++_DEBUG = (not sys.flags.ignore_environment ++ and bool(os.environ.get('PYTHONASYNCIODEBUG'))) ++ ++ ++# Check for CPython issue #21209 ++def has_yield_from_bug(): ++ class MyGen: ++ def __init__(self): ++ self.send_args = None ++ def __iter__(self): ++ return self ++ def __next__(self): ++ return 42 ++ def send(self, *what): ++ self.send_args = what ++ return None ++ def yield_from_gen(gen): ++ yield from gen ++ value = (1, 2, 3) ++ gen = MyGen() ++ coro = yield_from_gen(gen) ++ next(coro) ++ coro.send(value) ++ return gen.send_args != (value,) ++_YIELD_FROM_BUG = has_yield_from_bug() ++del has_yield_from_bug ++ ++ ++class CoroWrapper: ++ # Wrapper for coroutine object in _DEBUG mode. ++ ++ def __init__(self, gen, func): ++ assert inspect.isgenerator(gen), gen ++ self.gen = gen ++ self.func = func ++ self._source_traceback = traceback.extract_stack(sys._getframe(1)) ++ # __name__, __qualname__, __doc__ attributes are set by the coroutine() ++ # decorator ++ ++ def __repr__(self): ++ coro_repr = _format_coroutine(self) ++ if self._source_traceback: ++ frame = self._source_traceback[-1] ++ coro_repr += ', created at %s:%s' % (frame[0], frame[1]) ++ return '<%s %s>' % (self.__class__.__name__, coro_repr) ++ ++ def __iter__(self): ++ return self ++ ++ def __next__(self): ++ return next(self.gen) ++ ++ if _YIELD_FROM_BUG: ++ # For for CPython issue #21209: using "yield from" and a custom ++ # generator, generator.send(tuple) unpacks the tuple instead of passing ++ # the tuple unchanged. Check if the caller is a generator using "yield ++ # from" to decide if the parameter should be unpacked or not. ++ def send(self, *value): ++ frame = sys._getframe() ++ caller = frame.f_back ++ assert caller.f_lasti >= 0 ++ if caller.f_code.co_code[caller.f_lasti] != _YIELD_FROM: ++ value = value[0] ++ return self.gen.send(value) ++ else: ++ def send(self, value): ++ return self.gen.send(value) ++ ++ def throw(self, exc): ++ return self.gen.throw(exc) ++ ++ def close(self): ++ return self.gen.close() ++ ++ @property ++ def gi_frame(self): ++ return self.gen.gi_frame ++ ++ @property ++ def gi_running(self): ++ return self.gen.gi_running ++ ++ @property ++ def gi_code(self): ++ return self.gen.gi_code ++ ++ def __del__(self): ++ # Be careful accessing self.gen.frame -- self.gen might not exist. ++ gen = getattr(self, 'gen', None) ++ frame = getattr(gen, 'gi_frame', None) ++ if frame is not None and frame.f_lasti == -1: ++ msg = '%r was never yielded from' % self ++ tb = getattr(self, '_source_traceback', ()) ++ if tb: ++ tb = ''.join(traceback.format_list(tb)) ++ msg += ('\nCoroutine object created at ' ++ '(most recent call last):\n') ++ msg += tb.rstrip() ++ logger.error(msg) ++ ++ ++def coroutine(func): ++ """Decorator to mark coroutines. ++ ++ If the coroutine is not yielded from before it is destroyed, ++ an error message is logged. ++ """ ++ if inspect.isgeneratorfunction(func): ++ coro = func ++ else: ++ @functools.wraps(func) ++ def coro(*args, **kw): ++ res = func(*args, **kw) ++ if isinstance(res, futures.Future) or inspect.isgenerator(res): ++ res = yield from res ++ return res ++ ++ if not _DEBUG: ++ wrapper = coro ++ else: ++ @functools.wraps(func) ++ def wrapper(*args, **kwds): ++ w = CoroWrapper(coro(*args, **kwds), func) ++ if w._source_traceback: ++ del w._source_traceback[-1] ++ w.__name__ = func.__name__ ++ if hasattr(func, '__qualname__'): ++ w.__qualname__ = func.__qualname__ ++ w.__doc__ = func.__doc__ ++ return w ++ ++ wrapper._is_coroutine = True # For iscoroutinefunction(). 
++ return wrapper ++ ++ ++def iscoroutinefunction(func): ++ """Return True if func is a decorated coroutine function.""" ++ return getattr(func, '_is_coroutine', False) ++ ++ ++_COROUTINE_TYPES = (types.GeneratorType, CoroWrapper) ++ ++def iscoroutine(obj): ++ """Return True if obj is a coroutine object.""" ++ return isinstance(obj, _COROUTINE_TYPES) ++ ++ ++def _format_coroutine(coro): ++ assert iscoroutine(coro) ++ coro_name = getattr(coro, '__qualname__', coro.__name__) ++ ++ filename = coro.gi_code.co_filename ++ if (isinstance(coro, CoroWrapper) ++ and not inspect.isgeneratorfunction(coro.func)): ++ filename, lineno = events._get_function_source(coro.func) ++ if coro.gi_frame is None: ++ coro_repr = '%s() done, defined at %s:%s' % (coro_name, filename, lineno) ++ else: ++ coro_repr = '%s() running, defined at %s:%s' % (coro_name, filename, lineno) ++ elif coro.gi_frame is not None: ++ lineno = coro.gi_frame.f_lineno ++ coro_repr = '%s() running at %s:%s' % (coro_name, filename, lineno) ++ else: ++ lineno = coro.gi_code.co_firstlineno ++ coro_repr = '%s() done, defined at %s:%s' % (coro_name, filename, lineno) ++ ++ return coro_repr +diff -r c0e311e010fc Lib/asyncio/events.py +--- a/Lib/asyncio/events.py ++++ b/Lib/asyncio/events.py +@@ -8,15 +8,67 @@ + 'get_child_watcher', 'set_child_watcher', + ] + ++import functools ++import inspect + import subprocess ++import traceback + import threading + import socket ++import sys ++ ++ ++_PY34 = sys.version_info >= (3, 4) ++ ++ ++def _get_function_source(func): ++ if _PY34: ++ func = inspect.unwrap(func) ++ elif hasattr(func, '__wrapped__'): ++ func = func.__wrapped__ ++ if inspect.isfunction(func): ++ code = func.__code__ ++ return (code.co_filename, code.co_firstlineno) ++ if isinstance(func, functools.partial): ++ return _get_function_source(func.func) ++ if _PY34 and isinstance(func, functools.partialmethod): ++ return _get_function_source(func.func) ++ return None ++ ++ ++def _format_args(args): ++ # function formatting ('hello',) as ('hello') ++ args_repr = repr(args) ++ if len(args) == 1 and args_repr.endswith(',)'): ++ args_repr = args_repr[:-2] + ')' ++ return args_repr ++ ++ ++def _format_callback(func, args, suffix=''): ++ if isinstance(func, functools.partial): ++ if args is not None: ++ suffix = _format_args(args) + suffix ++ return _format_callback(func.func, func.args, suffix) ++ ++ func_repr = getattr(func, '__qualname__', None) ++ if not func_repr: ++ func_repr = repr(func) ++ ++ if args is not None: ++ func_repr += _format_args(args) ++ if suffix: ++ func_repr += suffix ++ ++ source = _get_function_source(func) ++ if source: ++ func_repr += ' at %s:%s' % source ++ return func_repr + + + class Handle: + """Object returned by callback registration methods.""" + +- __slots__ = ['_callback', '_args', '_cancelled', '_loop', '__weakref__'] ++ __slots__ = ('_callback', '_args', '_cancelled', '_loop', ++ '_source_traceback', '__weakref__') + + def __init__(self, callback, args, loop): + assert not isinstance(callback, Handle), 'A Handle is not a callback' +@@ -24,27 +76,41 @@ + self._callback = callback + self._args = args + self._cancelled = False ++ if self._loop.get_debug(): ++ self._source_traceback = traceback.extract_stack(sys._getframe(1)) ++ else: ++ self._source_traceback = None + + def __repr__(self): +- res = 'Handle({}, {})'.format(self._callback, self._args) ++ info = [self.__class__.__name__] + if self._cancelled: +- res += '' +- return res ++ info.append('cancelled') ++ if self._callback is not None: ++ 
info.append(_format_callback(self._callback, self._args)) ++ if self._source_traceback: ++ frame = self._source_traceback[-1] ++ info.append('created at %s:%s' % (frame[0], frame[1])) ++ return '<%s>' % ' '.join(info) + + def cancel(self): + self._cancelled = True ++ self._callback = None ++ self._args = None + + def _run(self): + try: + self._callback(*self._args) + except Exception as exc: +- msg = 'Exception in callback {}{!r}'.format(self._callback, +- self._args) +- self._loop.call_exception_handler({ ++ cb = _format_callback(self._callback, self._args) ++ msg = 'Exception in callback {}'.format(cb) ++ context = { + 'message': msg, + 'exception': exc, + 'handle': self, +- }) ++ } ++ if self._source_traceback: ++ context['source_traceback'] = self._source_traceback ++ self._loop.call_exception_handler(context) + self = None # Needed to break cycles when an exception occurs. + + +@@ -56,17 +122,21 @@ + def __init__(self, when, callback, args, loop): + assert when is not None + super().__init__(callback, args, loop) +- ++ if self._source_traceback: ++ del self._source_traceback[-1] + self._when = when + + def __repr__(self): +- res = 'TimerHandle({}, {}, {})'.format(self._when, +- self._callback, +- self._args) ++ info = [] + if self._cancelled: +- res += '' +- +- return res ++ info.append('cancelled') ++ info.append('when=%s' % self._when) ++ if self._callback is not None: ++ info.append(_format_callback(self._callback, self._args)) ++ if self._source_traceback: ++ frame = self._source_traceback[-1] ++ info.append('created at %s:%s' % (frame[0], frame[1])) ++ return '<%s %s>' % (self.__class__.__name__, ' '.join(info)) + + def __hash__(self): + return hash(self._when) +@@ -140,6 +210,10 @@ + """Return whether the event loop is currently running.""" + raise NotImplementedError + ++ def is_closed(self): ++ """Returns True if the event loop was closed.""" ++ raise NotImplementedError ++ + def close(self): + """Close the loop. + +@@ -165,6 +239,11 @@ + def time(self): + raise NotImplementedError + ++ # Method scheduling a coroutine object: create a task. ++ ++ def create_task(self, coro): ++ raise NotImplementedError ++ + # Methods for interacting with threads. + + def call_soon_threadsafe(self, callback, *args): +@@ -257,11 +336,11 @@ + # Pipes and subprocesses. + + def connect_read_pipe(self, protocol_factory, pipe): +- """Register read pipe in event loop. ++ """Register read pipe in event loop. Set the pipe to non-blocking mode. + + protocol_factory should instantiate object with Protocol interface. +- pipe is file-like object already switched to nonblocking. +- Return pair (transport, protocol), where transport support ++ pipe is a file-like object. ++ Return pair (transport, protocol), where transport supports the + ReadTransport interface.""" + # The reason to accept file-like object instead of just file descriptor + # is: we need to own pipe and close it at transport finishing +@@ -355,25 +434,33 @@ + """Abstract policy for accessing the event loop.""" + + def get_event_loop(self): +- """XXX""" ++ """Get the event loop for the current context. ++ ++ Returns an event loop object implementing the BaseEventLoop interface, ++ or raises an exception in case no event loop has been set for the ++ current context and the current policy does not specify to create one. 
++ ++ It should never return None.""" + raise NotImplementedError + + def set_event_loop(self, loop): +- """XXX""" ++ """Set the event loop for the current context to loop.""" + raise NotImplementedError + + def new_event_loop(self): +- """XXX""" ++ """Create and return a new event loop object according to this ++ policy's rules. If there's need to set this loop as the event loop for ++ the current context, set_event_loop must be called explicitly.""" + raise NotImplementedError + + # Child processes handling (Unix only). + + def get_child_watcher(self): +- """XXX""" ++ "Get the watcher for child processes." + raise NotImplementedError + + def set_child_watcher(self, watcher): +- """XXX""" ++ """Set the watcher for child processes.""" + raise NotImplementedError + + +@@ -447,39 +534,42 @@ + + + def get_event_loop_policy(): +- """XXX""" ++ """Get the current event loop policy.""" + if _event_loop_policy is None: + _init_event_loop_policy() + return _event_loop_policy + + + def set_event_loop_policy(policy): +- """XXX""" ++ """Set the current event loop policy. ++ ++ If policy is None, the default policy is restored.""" + global _event_loop_policy + assert policy is None or isinstance(policy, AbstractEventLoopPolicy) + _event_loop_policy = policy + + + def get_event_loop(): +- """XXX""" ++ """Equivalent to calling get_event_loop_policy().get_event_loop().""" + return get_event_loop_policy().get_event_loop() + + + def set_event_loop(loop): +- """XXX""" ++ """Equivalent to calling get_event_loop_policy().set_event_loop(loop).""" + get_event_loop_policy().set_event_loop(loop) + + + def new_event_loop(): +- """XXX""" ++ """Equivalent to calling get_event_loop_policy().new_event_loop().""" + return get_event_loop_policy().new_event_loop() + + + def get_child_watcher(): +- """XXX""" ++ """Equivalent to calling get_event_loop_policy().get_child_watcher().""" + return get_event_loop_policy().get_child_watcher() + + + def set_child_watcher(watcher): +- """XXX""" ++ """Equivalent to calling ++ get_event_loop_policy().set_child_watcher(watcher).""" + return get_event_loop_policy().set_child_watcher(watcher) +diff -r c0e311e010fc Lib/asyncio/futures.py +--- a/Lib/asyncio/futures.py ++++ b/Lib/asyncio/futures.py +@@ -82,10 +82,11 @@ + in a discussion about closing files when they are collected. 
+ """ + +- __slots__ = ['exc', 'tb', 'loop'] ++ __slots__ = ('loop', 'source_traceback', 'exc', 'tb') + +- def __init__(self, exc, loop): +- self.loop = loop ++ def __init__(self, future, exc): ++ self.loop = future._loop ++ self.source_traceback = future._source_traceback + self.exc = exc + self.tb = None + +@@ -102,11 +103,12 @@ + + def __del__(self): + if self.tb: +- msg = 'Future/Task exception was never retrieved:\n{tb}' +- context = { +- 'message': msg.format(tb=''.join(self.tb)), +- } +- self.loop.call_exception_handler(context) ++ msg = 'Future/Task exception was never retrieved' ++ if self.source_traceback: ++ msg += '\nFuture/Task created at (most recent call last):\n' ++ msg += ''.join(traceback.format_list(self.source_traceback)) ++ msg += ''.join(self.tb).rstrip() ++ self.loop.call_exception_handler({'message': msg}) + + + class Future: +@@ -149,26 +151,49 @@ + else: + self._loop = loop + self._callbacks = [] ++ if self._loop.get_debug(): ++ self._source_traceback = traceback.extract_stack(sys._getframe(1)) ++ else: ++ self._source_traceback = None ++ ++ def _format_callbacks(self): ++ cb = self._callbacks ++ size = len(cb) ++ if not size: ++ cb = '' ++ ++ def format_cb(callback): ++ return events._format_callback(callback, ()) ++ ++ if size == 1: ++ cb = format_cb(cb[0]) ++ elif size == 2: ++ cb = '{}, {}'.format(format_cb(cb[0]), format_cb(cb[1])) ++ elif size > 2: ++ cb = '{}, <{} more>, {}'.format(format_cb(cb[0]), ++ size-2, ++ format_cb(cb[-1])) ++ return 'cb=[%s]' % cb ++ ++ def _format_result(self): ++ if self._state != _FINISHED: ++ return None ++ elif self._exception is not None: ++ return 'exception={!r}'.format(self._exception) ++ else: ++ return 'result={!r}'.format(self._result) + + def __repr__(self): +- res = self.__class__.__name__ ++ info = [self._state.lower()] + if self._state == _FINISHED: +- if self._exception is not None: +- res += ''.format(self._exception) +- else: +- res += ''.format(self._result) +- elif self._callbacks: +- size = len(self._callbacks) +- if size > 2: +- res += '<{}, [{}, <{} more>, {}]>'.format( +- self._state, self._callbacks[0], +- size-2, self._callbacks[-1]) +- else: +- res += '<{}, {}>'.format(self._state, self._callbacks) +- else: +- res += '<{}>'.format(self._state) +- return res ++ info.append(self._format_result()) ++ if self._callbacks: ++ info.append(self._format_callbacks()) ++ return '<%s %s>' % (self.__class__.__name__, ' '.join(info)) + ++ # On Python 3.3 or older, objects with a destructor part of a reference ++ # cycle are never destroyed. It's not more the case on Python 3.4 thanks to ++ # the PEP 442. + if _PY34: + def __del__(self): + if not self._log_traceback: +@@ -177,10 +202,13 @@ + return + exc = self._exception + context = { +- 'message': 'Future/Task exception was never retrieved', ++ 'message': ('%s exception was never retrieved' ++ % self.__class__.__name__), + 'exception': exc, + 'future': self, + } ++ if self._source_traceback: ++ context['source_traceback'] = self._source_traceback + self._loop.call_exception_handler(context) + + def cancel(self): +@@ -288,6 +316,12 @@ + + # So-called internal methods (note: no set_running_or_notify_cancel()). + ++ def _set_result_unless_cancelled(self, result): ++ """Helper setting the result only if the future was not cancelled.""" ++ if self.cancelled(): ++ return ++ self.set_result(result) ++ + def set_result(self, result): + """Mark the future done and set its result. 
+ +@@ -316,7 +350,7 @@ + if _PY34: + self._log_traceback = True + else: +- self._tb_logger = _TracebackLogger(exception, self._loop) ++ self._tb_logger = _TracebackLogger(self, exception) + # Arrange for the logger to be activated after all callbacks + # have had a chance to call result() or exception(). + self._loop.call_soon(self._tb_logger.activate) +diff -r c0e311e010fc Lib/asyncio/locks.py +--- a/Lib/asyncio/locks.py ++++ b/Lib/asyncio/locks.py +@@ -6,7 +6,7 @@ + + from . import events + from . import futures +-from . import tasks ++from .coroutines import coroutine + + + class _ContextManager: +@@ -112,7 +112,7 @@ + """Return True if lock is acquired.""" + return self._locked + +- @tasks.coroutine ++ @coroutine + def acquire(self): + """Acquire a lock. + +@@ -225,7 +225,7 @@ + to true again.""" + self._value = False + +- @tasks.coroutine ++ @coroutine + def wait(self): + """Block until the internal flag is true. + +@@ -278,7 +278,7 @@ + extra = '{},waiters:{}'.format(extra, len(self._waiters)) + return '<{} [{}]>'.format(res[1:-1], extra) + +- @tasks.coroutine ++ @coroutine + def wait(self): + """Wait until notified. + +@@ -306,7 +306,7 @@ + finally: + yield from self.acquire() + +- @tasks.coroutine ++ @coroutine + def wait_for(self, predicate): + """Wait until a predicate becomes true. + +@@ -402,7 +402,7 @@ + """Returns True if semaphore can not be acquired immediately.""" + return self._value == 0 + +- @tasks.coroutine ++ @coroutine + def acquire(self): + """Acquire a semaphore. + +diff -r c0e311e010fc Lib/asyncio/proactor_events.py +--- a/Lib/asyncio/proactor_events.py ++++ b/Lib/asyncio/proactor_events.py +@@ -35,10 +35,24 @@ + self._closing = False # Set when close() called. + self._eof_written = False + if self._server is not None: +- self._server.attach(self) ++ self._server._attach() + self._loop.call_soon(self._protocol.connection_made, self) + if waiter is not None: +- self._loop.call_soon(waiter.set_result, None) ++ # wait until protocol.connection_made() has been called ++ self._loop.call_soon(waiter._set_result_unless_cancelled, None) ++ ++ def __repr__(self): ++ info = [self.__class__.__name__, 'fd=%s' % self._sock.fileno()] ++ if self._read_fut is not None: ++ info.append('read=%s' % self._read_fut) ++ if self._write_fut is not None: ++ info.append("write=%r" % self._write_fut) ++ if self._buffer: ++ bufsize = len(self._buffer) ++ info.append('write_bufsize=%s' % bufsize) ++ if self._eof_written: ++ info.append('EOF written') ++ return '<%s>' % ' '.join(info) + + def _set_extra(self, sock): + self._extra['pipe'] = sock +@@ -54,7 +68,10 @@ + self._read_fut.cancel() + + def _fatal_error(self, exc, message='Fatal error on pipe transport'): +- if not isinstance(exc, (BrokenPipeError, ConnectionResetError)): ++ if isinstance(exc, (BrokenPipeError, ConnectionResetError)): ++ if self._loop.get_debug(): ++ logger.debug("%r: %s", self, message, exc_info=True) ++ else: + self._loop.call_exception_handler({ + 'message': message, + 'exception': exc, +@@ -90,7 +107,7 @@ + self._sock.close() + server = self._server + if server is not None: +- server.detach(self) ++ server._detach() + self._server = None + + def get_write_buffer_size(self): +@@ -107,7 +124,6 @@ + def __init__(self, loop, sock, protocol, waiter=None, + extra=None, server=None): + super().__init__(loop, sock, protocol, waiter, extra, server) +- self._read_fut = None + self._paused = False + self._loop.call_soon(self._loop_reading) + +@@ -117,6 +133,8 @@ + if self._paused: + raise RuntimeError('Already paused') + 
self._paused = True ++ if self._loop.get_debug(): ++ logger.debug("%r pauses reading", self) + + def resume_reading(self): + if not self._paused: +@@ -125,6 +143,8 @@ + if self._closing: + return + self._loop.call_soon(self._loop_reading, self._read_fut) ++ if self._loop.get_debug(): ++ logger.debug("%r resumes reading", self) + + def _loop_reading(self, fut=None): + if self._paused: +@@ -165,6 +185,8 @@ + if data: + self._protocol.data_received(data) + elif data is not None: ++ if self._loop.get_debug(): ++ logger.debug("%r received EOF", self) + keep_open = self._protocol.eof_received() + if not keep_open: + self.close() +@@ -353,13 +375,14 @@ + sock, protocol, waiter, extra) + + def close(self): +- if self._proactor is not None: +- self._close_self_pipe() +- self._proactor.close() +- self._proactor = None +- self._selector = None +- super().close() +- self._accept_futures.clear() ++ if self.is_closed(): ++ return ++ super().close() ++ self._stop_accept_futures() ++ self._close_self_pipe() ++ self._proactor.close() ++ self._proactor = None ++ self._selector = None + + def sock_recv(self, sock, n): + return self._proactor.recv(sock, n) +@@ -399,7 +422,9 @@ + self._ssock.setblocking(False) + self._csock.setblocking(False) + self._internal_fds += 1 +- self.call_soon(self._loop_self_reading) ++ # don't check the current loop because _make_self_pipe() is called ++ # from the event loop constructor ++ self._call_soon(self._loop_self_reading, (), check_loop=False) + + def _loop_self_reading(self, f=None): + try: +@@ -414,7 +439,7 @@ + f.add_done_callback(self._loop_self_reading) + + def _write_to_self(self): +- self._csock.send(b'x') ++ self._csock.send(b'\0') + + def _start_serving(self, protocol_factory, sock, ssl=None, server=None): + if ssl: +@@ -424,10 +449,15 @@ + try: + if f is not None: + conn, addr = f.result() ++ if self._debug: ++ logger.debug("%r got a new connection from %r: %r", ++ server, addr, conn) + protocol = protocol_factory() + self._make_socket_transport( + conn, protocol, + extra={'peername': addr}, server=server) ++ if self.is_closed(): ++ return + f = self._proactor.accept(sock) + except OSError as exc: + if sock.fileno() != -1: +@@ -448,8 +478,12 @@ + def _process_events(self, event_list): + pass # XXX hard work currently done in poll + +- def _stop_serving(self, sock): ++ def _stop_accept_futures(self): + for future in self._accept_futures.values(): + future.cancel() ++ self._accept_futures.clear() ++ ++ def _stop_serving(self, sock): ++ self._stop_accept_futures() + self._proactor._stop_serving(sock) + sock.close() +diff -r c0e311e010fc Lib/asyncio/queues.py +--- a/Lib/asyncio/queues.py ++++ b/Lib/asyncio/queues.py +@@ -105,7 +105,7 @@ + if self._maxsize <= 0: + return False + else: +- return self.qsize() == self._maxsize ++ return self.qsize() >= self._maxsize + + @coroutine + def put(self, item): +@@ -126,7 +126,7 @@ + self._put(item) + getter.set_result(self._get()) + +- elif self._maxsize > 0 and self._maxsize == self.qsize(): ++ elif self._maxsize > 0 and self._maxsize <= self.qsize(): + waiter = futures.Future(loop=self._loop) + + self._putters.append((item, waiter)) +@@ -152,7 +152,7 @@ + self._put(item) + getter.set_result(self._get()) + +- elif self._maxsize > 0 and self._maxsize == self.qsize(): ++ elif self._maxsize > 0 and self._maxsize <= self.qsize(): + raise QueueFull + else: + self._put(item) +@@ -173,7 +173,7 @@ + # run, we need to defer the put for a tick to ensure that + # getters and putters alternate perfectly. See + # ChannelTest.test_wait. 
+- self._loop.call_soon(putter.set_result, None) ++ self._loop.call_soon(putter._set_result_unless_cancelled, None) + + return self._get() + +diff -r c0e311e010fc Lib/asyncio/selector_events.py +--- a/Lib/asyncio/selector_events.py ++++ b/Lib/asyncio/selector_events.py +@@ -23,6 +23,17 @@ + from .log import logger + + ++def _test_selector_event(selector, fd, event): ++ # Test if the selector is monitoring 'event' events ++ # for the file descriptor 'fd'. ++ try: ++ key = selector.get_key(fd) ++ except KeyError: ++ return False ++ else: ++ return bool(key.events & event) ++ ++ + class BaseSelectorEventLoop(base_events.BaseEventLoop): + """Selector event loop. + +@@ -51,15 +62,18 @@ + server_side, server_hostname, extra, server) + + def _make_datagram_transport(self, sock, protocol, +- address=None, extra=None): +- return _SelectorDatagramTransport(self, sock, protocol, address, extra) ++ address=None, waiter=None, extra=None): ++ return _SelectorDatagramTransport(self, sock, protocol, ++ address, waiter, extra) + + def close(self): ++ if self.is_closed(): ++ return ++ super().close() ++ self._close_self_pipe() + if self._selector is not None: +- self._close_self_pipe() + self._selector.close() + self._selector = None +- super().close() + + def _socketpair(self): + raise NotImplementedError +@@ -80,11 +94,20 @@ + self._internal_fds += 1 + self.add_reader(self._ssock.fileno(), self._read_from_self) + ++ def _process_self_data(self, data): ++ pass ++ + def _read_from_self(self): +- try: +- self._ssock.recv(1) +- except (BlockingIOError, InterruptedError): +- pass ++ while True: ++ try: ++ data = self._ssock.recv(4096) ++ if not data: ++ break ++ self._process_self_data(data) ++ except InterruptedError: ++ continue ++ except BlockingIOError: ++ break + + def _write_to_self(self): + # This may be called from a different thread, possibly after +@@ -95,9 +118,12 @@ + csock = self._csock + if csock is not None: + try: +- csock.send(b'x') ++ csock.send(b'\0') + except OSError: +- pass ++ if self._debug: ++ logger.debug("Fail to write a null byte into the " ++ "self-pipe socket", ++ exc_info=True) + + def _start_serving(self, protocol_factory, sock, + sslcontext=None, server=None): +@@ -108,6 +134,9 @@ + sslcontext=None, server=None): + try: + conn, addr = sock.accept() ++ if self._debug: ++ logger.debug("%r got a new connection from %r: %r", ++ server, addr, conn) + conn.setblocking(False) + except (BlockingIOError, InterruptedError, ConnectionAbortedError): + pass # False alarm. 
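The selector event loop hunks above and below add debug-mode instrumentation: newly accepted connections are logged, the self-pipe now writes b'\0' (so that the Unix loop can tell wake-up bytes apart from signal numbers, see unix_events.py further down), and the socket helpers gain real docstrings. Purely as an illustrative sketch, and not part of this patch, the snippet below shows how an application written against the 3.4-era API would switch on the debug mode that feeds this instrumentation; the coroutine name is made up, and the exact warning threshold is whatever loop.slow_callback_duration is set to.

import asyncio
import logging
import time

logging.basicConfig(level=logging.DEBUG)

@asyncio.coroutine
def blocking_step():
    # time.sleep() blocks the event loop; with debug mode enabled the
    # loop warns when a single callback or task step takes longer than
    # loop.slow_callback_duration.
    time.sleep(0.2)
    yield from asyncio.sleep(0)
    return 'done'

loop = asyncio.get_event_loop()
loop.set_debug(True)   # enable the debug checks added throughout this patch
print(loop.run_until_complete(blocking_step()))
loop.close()

With the patched loop this should log the "Executing ... took ... seconds" warning added to _run_once earlier in this diff, alongside the poll timing and connection debug messages.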
+@@ -143,8 +172,7 @@ + + def add_reader(self, fd, callback, *args): + """Add a reader callback.""" +- if self._selector is None: +- raise RuntimeError('Event loop is closed') ++ self._check_closed() + handle = events.Handle(callback, args, self) + try: + key = self._selector.get_key(fd) +@@ -160,7 +188,7 @@ + + def remove_reader(self, fd): + """Remove a reader callback.""" +- if self._selector is None: ++ if self.is_closed(): + return False + try: + key = self._selector.get_key(fd) +@@ -182,8 +210,7 @@ + + def add_writer(self, fd, callback, *args): + """Add a writer callback..""" +- if self._selector is None: +- raise RuntimeError('Event loop is closed') ++ self._check_closed() + handle = events.Handle(callback, args, self) + try: + key = self._selector.get_key(fd) +@@ -199,7 +226,7 @@ + + def remove_writer(self, fd): + """Remove a writer callback.""" +- if self._selector is None: ++ if self.is_closed(): + return False + try: + key = self._selector.get_key(fd) +@@ -221,7 +248,14 @@ + return False + + def sock_recv(self, sock, n): +- """XXX""" ++ """Receive data from the socket. ++ ++ The return value is a bytes object representing the data received. ++ The maximum amount of data to be received at once is specified by ++ nbytes. ++ ++ This method is a coroutine. ++ """ + fut = futures.Future(loop=self) + self._sock_recv(fut, False, sock, n) + return fut +@@ -248,7 +282,16 @@ + fut.set_result(data) + + def sock_sendall(self, sock, data): +- """XXX""" ++ """Send data to the socket. ++ ++ The socket must be connected to a remote socket. This method continues ++ to send data from data until either all data has been sent or an ++ error occurs. None is returned on success. On error, an exception is ++ raised, and there is no way to determine how much data, if any, was ++ successfully processed by the receiving end of the connection. ++ ++ This method is a coroutine. ++ """ + fut = futures.Future(loop=self) + if data: + self._sock_sendall(fut, False, sock, data) +@@ -280,7 +323,16 @@ + self.add_writer(fd, self._sock_sendall, fut, True, sock, data) + + def sock_connect(self, sock, address): +- """XXX""" ++ """Connect to a remote socket at address. ++ ++ The address must be already resolved to avoid the trap of hanging the ++ entire event loop when the address requires doing a DNS lookup. For ++ example, it must be an IP address, not an hostname, for AF_INET and ++ AF_INET6 address families. Use getaddrinfo() to resolve the hostname ++ asynchronously. ++ ++ This method is a coroutine. ++ """ + fut = futures.Future(loop=self) + try: + base_events._check_resolved_address(sock, address) +@@ -313,7 +365,15 @@ + fut.set_result(None) + + def sock_accept(self, sock): +- """XXX""" ++ """Accept a connection. ++ ++ The socket must be bound to an address and listening for connections. ++ The return value is a pair (conn, address) where conn is a new socket ++ object usable to send and receive data on the connection, and address ++ is the address bound to the socket on the other end of the connection. ++ ++ This method is a coroutine. ++ """ + fut = futures.Future(loop=self) + self._sock_accept(fut, False, sock) + return fut +@@ -378,7 +438,27 @@ + self._conn_lost = 0 # Set when call to connection_lost scheduled. + self._closing = False # Set when close() called. 
+ if self._server is not None: +- self._server.attach(self) ++ self._server._attach() ++ ++ def __repr__(self): ++ info = [self.__class__.__name__, 'fd=%s' % self._sock_fd] ++ polling = _test_selector_event(self._loop._selector, ++ self._sock_fd, selectors.EVENT_READ) ++ if polling: ++ info.append('read=polling') ++ else: ++ info.append('read=idle') ++ ++ polling = _test_selector_event(self._loop._selector, ++ self._sock_fd, selectors.EVENT_WRITE) ++ if polling: ++ state = 'polling' ++ else: ++ state = 'idle' ++ ++ bufsize = self.get_write_buffer_size() ++ info.append('write=<%s, bufsize=%s>' % (state, bufsize)) ++ return '<%s>' % ' '.join(info) + + def abort(self): + self._force_close(None) +@@ -394,7 +474,10 @@ + + def _fatal_error(self, exc, message='Fatal error on transport'): + # Should be called from exception handler only. +- if not isinstance(exc, (BrokenPipeError, ConnectionResetError)): ++ if isinstance(exc, (BrokenPipeError, ConnectionResetError)): ++ if self._loop.get_debug(): ++ logger.debug("%r: %s", self, message, exc_info=True) ++ else: + self._loop.call_exception_handler({ + 'message': message, + 'exception': exc, +@@ -425,7 +508,7 @@ + self._loop = None + server = self._server + if server is not None: +- server.detach(self) ++ server._detach() + self._server = None + + def get_write_buffer_size(self): +@@ -443,7 +526,8 @@ + self._loop.add_reader(self._sock_fd, self._read_ready) + self._loop.call_soon(self._protocol.connection_made, self) + if waiter is not None: +- self._loop.call_soon(waiter.set_result, None) ++ # wait until protocol.connection_made() has been called ++ self._loop.call_soon(waiter._set_result_unless_cancelled, None) + + def pause_reading(self): + if self._closing: +@@ -452,6 +536,8 @@ + raise RuntimeError('Already paused') + self._paused = True + self._loop.remove_reader(self._sock_fd) ++ if self._loop.get_debug(): ++ logger.debug("%r pauses reading", self) + + def resume_reading(self): + if not self._paused: +@@ -460,6 +546,8 @@ + if self._closing: + return + self._loop.add_reader(self._sock_fd, self._read_ready) ++ if self._loop.get_debug(): ++ logger.debug("%r resumes reading", self) + + def _read_ready(self): + try: +@@ -472,6 +560,8 @@ + if data: + self._protocol.data_received(data) + else: ++ if self._loop.get_debug(): ++ logger.debug("%r received EOF", self) + keep_open = self._protocol.eof_received() + if keep_open: + # We're keeping the connection open so the +@@ -598,31 +688,37 @@ + # SSL-specific extra info. 
(peercert is set later) + self._extra.update(sslcontext=sslcontext) + +- self._on_handshake() ++ if self._loop.get_debug(): ++ logger.debug("%r starts SSL handshake", self) ++ start_time = self._loop.time() ++ else: ++ start_time = None ++ self._on_handshake(start_time) + +- def _on_handshake(self): ++ def _on_handshake(self, start_time): + try: + self._sock.do_handshake() + except ssl.SSLWantReadError: +- self._loop.add_reader(self._sock_fd, self._on_handshake) ++ self._loop.add_reader(self._sock_fd, ++ self._on_handshake, start_time) + return + except ssl.SSLWantWriteError: +- self._loop.add_writer(self._sock_fd, self._on_handshake) ++ self._loop.add_writer(self._sock_fd, ++ self._on_handshake, start_time) + return +- except Exception as exc: ++ except BaseException as exc: ++ if self._loop.get_debug(): ++ logger.warning("%r: SSL handshake failed", ++ self, exc_info=True) + self._loop.remove_reader(self._sock_fd) + self._loop.remove_writer(self._sock_fd) + self._sock.close() + if self._waiter is not None: + self._waiter.set_exception(exc) +- return +- except BaseException as exc: +- self._loop.remove_reader(self._sock_fd) +- self._loop.remove_writer(self._sock_fd) +- self._sock.close() +- if self._waiter is not None: +- self._waiter.set_exception(exc) +- raise ++ if isinstance(exc, Exception): ++ return ++ else: ++ raise + + self._loop.remove_reader(self._sock_fd) + self._loop.remove_writer(self._sock_fd) +@@ -636,6 +732,10 @@ + try: + ssl.match_hostname(peercert, self._server_hostname) + except Exception as exc: ++ if self._loop.get_debug(): ++ logger.warning("%r: SSL handshake failed " ++ "on matching the hostname", ++ self, exc_info=True) + self._sock.close() + if self._waiter is not None: + self._waiter.set_exception(exc) +@@ -652,7 +752,13 @@ + self._loop.add_reader(self._sock_fd, self._read_ready) + self._loop.call_soon(self._protocol.connection_made, self) + if self._waiter is not None: +- self._loop.call_soon(self._waiter.set_result, None) ++ # wait until protocol.connection_made() has been called ++ self._loop.call_soon(self._waiter._set_result_unless_cancelled, ++ None) ++ ++ if self._loop.get_debug(): ++ dt = self._loop.time() - start_time ++ logger.debug("%r: SSL handshake took %.1f ms", self, dt * 1e3) + + def pause_reading(self): + # XXX This is a bit icky, given the comment at the top of +@@ -667,14 +773,18 @@ + raise RuntimeError('Already paused') + self._paused = True + self._loop.remove_reader(self._sock_fd) ++ if self._loop.get_debug(): ++ logger.debug("%r pauses reading", self) + + def resume_reading(self): + if not self._paused: +- raise ('Not paused') ++ raise RuntimeError('Not paused') + self._paused = False + if self._closing: + return + self._loop.add_reader(self._sock_fd, self._read_ready) ++ if self._loop.get_debug(): ++ logger.debug("%r resumes reading", self) + + def _read_ready(self): + if self._write_wants_read: +@@ -699,6 +809,8 @@ + self._protocol.data_received(data) + else: + try: ++ if self._loop.get_debug(): ++ logger.debug("%r received EOF", self) + keep_open = self._protocol.eof_received() + if keep_open: + logger.warning('returning true from eof_received() ' +@@ -767,11 +879,15 @@ + + _buffer_factory = collections.deque + +- def __init__(self, loop, sock, protocol, address=None, extra=None): ++ def __init__(self, loop, sock, protocol, address=None, ++ waiter=None, extra=None): + super().__init__(loop, sock, protocol, extra) + self._address = address + self._loop.add_reader(self._sock_fd, self._read_ready) + 
self._loop.call_soon(self._protocol.connection_made, self) ++ if waiter is not None: ++ # wait until protocol.connection_made() has been called ++ self._loop.call_soon(waiter._set_result_unless_cancelled, None) + + def get_write_buffer_size(self): + return sum(len(data) for data, _ in self._buffer) +diff -r c0e311e010fc Lib/asyncio/streams.py +--- a/Lib/asyncio/streams.py ++++ b/Lib/asyncio/streams.py +@@ -10,10 +10,12 @@ + if hasattr(socket, 'AF_UNIX'): + __all__.extend(['open_unix_connection', 'start_unix_server']) + ++from . import coroutines + from . import events + from . import futures + from . import protocols +-from . import tasks ++from .coroutines import coroutine ++from .log import logger + + + _DEFAULT_LIMIT = 2**16 +@@ -33,7 +35,7 @@ + self.expected = expected + + +-@tasks.coroutine ++@coroutine + def open_connection(host=None, port=None, *, + loop=None, limit=_DEFAULT_LIMIT, **kwds): + """A wrapper for create_connection() returning a (reader, writer) pair. +@@ -63,7 +65,7 @@ + return reader, writer + + +-@tasks.coroutine ++@coroutine + def start_server(client_connected_cb, host=None, port=None, *, + loop=None, limit=_DEFAULT_LIMIT, **kwds): + """Start a socket server, call back for each client connected. +@@ -102,7 +104,7 @@ + if hasattr(socket, 'AF_UNIX'): + # UNIX Domain Sockets are supported on this platform + +- @tasks.coroutine ++ @coroutine + def open_unix_connection(path=None, *, + loop=None, limit=_DEFAULT_LIMIT, **kwds): + """Similar to `open_connection` but works with UNIX Domain Sockets.""" +@@ -116,7 +118,7 @@ + return reader, writer + + +- @tasks.coroutine ++ @coroutine + def start_unix_server(client_connected_cb, path=None, *, + loop=None, limit=_DEFAULT_LIMIT, **kwds): + """Similar to `start_server` but works with UNIX Domain Sockets.""" +@@ -139,23 +141,27 @@ + resume_reading() and connection_lost(). If the subclass overrides + these it must call the super methods. + +- StreamWriter.drain() must check for error conditions and then call +- _make_drain_waiter(), which will return either () or a Future +- depending on the paused state. ++ StreamWriter.drain() must wait for _drain_helper() coroutine. + """ + + def __init__(self, loop=None): + self._loop = loop # May be None; we may never need it. + self._paused = False + self._drain_waiter = None ++ self._connection_lost = False + + def pause_writing(self): + assert not self._paused + self._paused = True ++ if self._loop.get_debug(): ++ logger.debug("%r pauses writing", self) + + def resume_writing(self): + assert self._paused + self._paused = False ++ if self._loop.get_debug(): ++ logger.debug("%r resumes writing", self) ++ + waiter = self._drain_waiter + if waiter is not None: + self._drain_waiter = None +@@ -163,6 +169,7 @@ + waiter.set_result(None) + + def connection_lost(self, exc): ++ self._connection_lost = True + # Wake up the writer if currently paused. 
+ if not self._paused: + return +@@ -177,14 +184,17 @@ + else: + waiter.set_exception(exc) + +- def _make_drain_waiter(self): ++ @coroutine ++ def _drain_helper(self): ++ if self._connection_lost: ++ raise ConnectionResetError('Connection lost') + if not self._paused: +- return () ++ return + waiter = self._drain_waiter + assert waiter is None or waiter.cancelled() + waiter = futures.Future(loop=self._loop) + self._drain_waiter = waiter +- return waiter ++ yield from waiter + + + class StreamReaderProtocol(FlowControlMixin, protocols.Protocol): +@@ -210,8 +220,8 @@ + self._loop) + res = self._client_connected_cb(self._stream_reader, + self._stream_writer) +- if tasks.iscoroutine(res): +- tasks.Task(res, loop=self._loop) ++ if coroutines.iscoroutine(res): ++ self._loop.create_task(res) + + def connection_lost(self, exc): + if exc is None: +@@ -240,9 +250,17 @@ + def __init__(self, transport, protocol, reader, loop): + self._transport = transport + self._protocol = protocol ++ # drain() expects that the reader has a exception() method ++ assert reader is None or isinstance(reader, StreamReader) + self._reader = reader + self._loop = loop + ++ def __repr__(self): ++ info = [self.__class__.__name__, 'transport=%r' % self._transport] ++ if self._reader is not None: ++ info.append('reader=%r' % self._reader) ++ return '<%s>' % ' '.join(info) ++ + @property + def transport(self): + return self._transport +@@ -265,26 +283,20 @@ + def get_extra_info(self, name, default=None): + return self._transport.get_extra_info(name, default) + ++ @coroutine + def drain(self): +- """This method has an unusual return value. ++ """Flush the write buffer. + + The intended use is to write + + w.write(data) + yield from w.drain() +- +- When there's nothing to wait for, drain() returns (), and the +- yield-from continues immediately. When the transport buffer +- is full (the protocol is paused), drain() creates and returns +- a Future and the yield-from will block until that Future is +- completed, which will happen when the buffer is (partially) +- drained and the protocol is resumed. + """ +- if self._reader is not None and self._reader._exception is not None: +- raise self._reader._exception +- if self._transport._conn_lost: # Uses private variable. +- raise ConnectionResetError('Connection lost') +- return self._protocol._make_drain_waiter() ++ if self._reader is not None: ++ exc = self._reader.exception() ++ if exc is not None: ++ raise exc ++ yield from self._protocol._drain_helper() + + + class StreamReader: +@@ -373,7 +385,7 @@ + 'already waiting for incoming data' % func_name) + return futures.Future(loop=self._loop) + +- @tasks.coroutine ++ @coroutine + def readline(self): + if self._exception is not None: + raise self._exception +@@ -410,7 +422,7 @@ + self._maybe_resume_transport() + return bytes(line) + +- @tasks.coroutine ++ @coroutine + def read(self, n=-1): + if self._exception is not None: + raise self._exception +@@ -449,7 +461,7 @@ + self._maybe_resume_transport() + return data + +- @tasks.coroutine ++ @coroutine + def readexactly(self, n): + if self._exception is not None: + raise self._exception +diff -r c0e311e010fc Lib/asyncio/subprocess.py +--- a/Lib/asyncio/subprocess.py ++++ b/Lib/asyncio/subprocess.py +@@ -8,6 +8,8 @@ + from . import protocols + from . import streams + from . 
import tasks ++from .coroutines import coroutine ++from .log import logger + + + PIPE = subprocess.PIPE +@@ -27,6 +29,16 @@ + self._waiters = collections.deque() + self._transport = None + ++ def __repr__(self): ++ info = [self.__class__.__name__] ++ if self.stdin is not None: ++ info.append('stdin=%r' % self.stdin) ++ if self.stdout is not None: ++ info.append('stdout=%r' % self.stdout) ++ if self.stderr is not None: ++ info.append('stderr=%r' % self.stderr) ++ return '<%s>' % ' '.join(info) ++ + def connection_made(self, transport): + self._transport = transport + if transport.get_pipe_transport(1): +@@ -90,11 +102,14 @@ + self.stderr = protocol.stderr + self.pid = transport.get_pid() + ++ def __repr__(self): ++ return '<%s %s>' % (self.__class__.__name__, self.pid) ++ + @property + def returncode(self): + return self._transport.get_returncode() + +- @tasks.coroutine ++ @coroutine + def wait(self): + """Wait until the process exit and return the process return code.""" + returncode = self._transport.get_returncode() +@@ -122,17 +137,29 @@ + self._check_alive() + self._transport.kill() + +- @tasks.coroutine ++ @coroutine + def _feed_stdin(self, input): ++ debug = self._loop.get_debug() + self.stdin.write(input) +- yield from self.stdin.drain() ++ if debug: ++ logger.debug('%r communicate: feed stdin (%s bytes)', ++ self, len(input)) ++ try: ++ yield from self.stdin.drain() ++ except (BrokenPipeError, ConnectionResetError) as exc: ++ # communicate() ignores BrokenPipeError and ConnectionResetError ++ if debug: ++ logger.debug('%r communicate: stdin got %r', self, exc) ++ ++ if debug: ++ logger.debug('%r communicate: close stdin', self) + self.stdin.close() + +- @tasks.coroutine ++ @coroutine + def _noop(self): + return None + +- @tasks.coroutine ++ @coroutine + def _read_stream(self, fd): + transport = self._transport.get_pipe_transport(fd) + if fd == 2: +@@ -140,11 +167,17 @@ + else: + assert fd == 1 + stream = self.stdout ++ if self._loop.get_debug(): ++ name = 'stdout' if fd == 1 else 'stderr' ++ logger.debug('%r communicate: read %s', self, name) + output = yield from stream.read() ++ if self._loop.get_debug(): ++ name = 'stdout' if fd == 1 else 'stderr' ++ logger.debug('%r communicate: close %s', self, name) + transport.close() + return output + +- @tasks.coroutine ++ @coroutine + def communicate(self, input=None): + if input: + stdin = self._feed_stdin(input) +@@ -164,7 +197,7 @@ + return (stdout, stderr) + + +-@tasks.coroutine ++@coroutine + def create_subprocess_shell(cmd, stdin=None, stdout=None, stderr=None, + loop=None, limit=streams._DEFAULT_LIMIT, **kwds): + if loop is None: +@@ -178,7 +211,7 @@ + yield from protocol.waiter + return Process(transport, protocol, loop) + +-@tasks.coroutine ++@coroutine + def create_subprocess_exec(program, *args, stdin=None, stdout=None, + stderr=None, loop=None, + limit=streams._DEFAULT_LIMIT, **kwds): +diff -r c0e311e010fc Lib/asyncio/tasks.py +--- a/Lib/asyncio/tasks.py ++++ b/Lib/asyncio/tasks.py +@@ -1,7 +1,6 @@ + """Support for tasks, coroutines and the scheduler.""" + +-__all__ = ['coroutine', 'Task', +- 'iscoroutinefunction', 'iscoroutine', ++__all__ = ['Task', + 'FIRST_COMPLETED', 'FIRST_EXCEPTION', 'ALL_COMPLETED', + 'wait', 'wait_for', 'as_completed', 'sleep', 'async', + 'gather', 'shield', +@@ -11,122 +10,16 @@ + import functools + import inspect + import linecache +-import os + import sys + import traceback + import weakref + ++from . import coroutines + from . import events + from . 
import futures +-from .log import logger ++from .coroutines import coroutine + +-# If you set _DEBUG to true, @coroutine will wrap the resulting +-# generator objects in a CoroWrapper instance (defined below). That +-# instance will log a message when the generator is never iterated +-# over, which may happen when you forget to use "yield from" with a +-# coroutine call. Note that the value of the _DEBUG flag is taken +-# when the decorator is used, so to be of any use it must be set +-# before you define your coroutines. A downside of using this feature +-# is that tracebacks show entries for the CoroWrapper.__next__ method +-# when _DEBUG is true. +-_DEBUG = (not sys.flags.ignore_environment +- and bool(os.environ.get('PYTHONASYNCIODEBUG'))) +- +- +-class CoroWrapper: +- # Wrapper for coroutine in _DEBUG mode. +- +- __slots__ = ['gen', 'func', '__name__', '__doc__', '__weakref__'] +- +- def __init__(self, gen, func): +- assert inspect.isgenerator(gen), gen +- self.gen = gen +- self.func = func +- +- def __iter__(self): +- return self +- +- def __next__(self): +- return next(self.gen) +- +- def send(self, *value): +- # We use `*value` because of a bug in CPythons prior +- # to 3.4.1. See issue #21209 and test_yield_from_corowrapper +- # for details. This workaround should be removed in 3.5.0. +- if len(value) == 1: +- value = value[0] +- return self.gen.send(value) +- +- def throw(self, exc): +- return self.gen.throw(exc) +- +- def close(self): +- return self.gen.close() +- +- @property +- def gi_frame(self): +- return self.gen.gi_frame +- +- @property +- def gi_running(self): +- return self.gen.gi_running +- +- @property +- def gi_code(self): +- return self.gen.gi_code +- +- def __del__(self): +- # Be careful accessing self.gen.frame -- self.gen might not exist. +- gen = getattr(self, 'gen', None) +- frame = getattr(gen, 'gi_frame', None) +- if frame is not None and frame.f_lasti == -1: +- func = self.func +- code = func.__code__ +- filename = code.co_filename +- lineno = code.co_firstlineno +- logger.error( +- 'Coroutine %r defined at %s:%s was never yielded from', +- func.__name__, filename, lineno) +- +- +-def coroutine(func): +- """Decorator to mark coroutines. +- +- If the coroutine is not yielded from before it is destroyed, +- an error message is logged. +- """ +- if inspect.isgeneratorfunction(func): +- coro = func +- else: +- @functools.wraps(func) +- def coro(*args, **kw): +- res = func(*args, **kw) +- if isinstance(res, futures.Future) or inspect.isgenerator(res): +- res = yield from res +- return res +- +- if not _DEBUG: +- wrapper = coro +- else: +- @functools.wraps(func) +- def wrapper(*args, **kwds): +- w = CoroWrapper(coro(*args, **kwds), func) +- w.__name__ = coro.__name__ +- w.__doc__ = coro.__doc__ +- return w +- +- wrapper._is_coroutine = True # For iscoroutinefunction(). +- return wrapper +- +- +-def iscoroutinefunction(func): +- """Return True if func is a decorated coroutine function.""" +- return getattr(func, '_is_coroutine', False) +- +- +-def iscoroutine(obj): +- """Return True if obj is a coroutine object.""" +- return isinstance(obj, CoroWrapper) or inspect.isgenerator(obj) ++_PY34 = (sys.version_info >= (3, 4)) + + + class Task(futures.Future): +@@ -171,25 +64,58 @@ + return {t for t in cls._all_tasks if t._loop is loop} + + def __init__(self, coro, *, loop=None): +- assert iscoroutine(coro), repr(coro) # Not a coroutine function! ++ assert coroutines.iscoroutine(coro), repr(coro) # Not a coroutine function! 
+ super().__init__(loop=loop) ++ if self._source_traceback: ++ del self._source_traceback[-1] + self._coro = iter(coro) # Use the iterator just in case. + self._fut_waiter = None + self._must_cancel = False + self._loop.call_soon(self._step) + self.__class__._all_tasks.add(self) ++ # If False, don't log a message if the task is destroyed whereas its ++ # status is still pending ++ self._log_destroy_pending = True ++ ++ # On Python 3.3 or older, objects with a destructor part of a reference ++ # cycle are never destroyed. It's not more the case on Python 3.4 thanks to ++ # the PEP 442. ++ if _PY34: ++ def __del__(self): ++ if self._state == futures._PENDING and self._log_destroy_pending: ++ context = { ++ 'task': self, ++ 'message': 'Task was destroyed but it is pending!', ++ } ++ if self._source_traceback: ++ context['source_traceback'] = self._source_traceback ++ self._loop.call_exception_handler(context) ++ futures.Future.__del__(self) + + def __repr__(self): +- res = super().__repr__() +- if (self._must_cancel and +- self._state == futures._PENDING and +- ')'.format(self._coro.__name__) + res[i:] +- return res ++ info = [] ++ if self._must_cancel: ++ info.append('cancelling') ++ else: ++ info.append(self._state.lower()) ++ ++ coro = coroutines._format_coroutine(self._coro) ++ info.append('coro=<%s>' % coro) ++ ++ if self._source_traceback: ++ frame = self._source_traceback[-1] ++ info.append('created at %s:%s' % (frame[0], frame[1])) ++ ++ if self._state == futures._FINISHED: ++ info.append(self._format_result()) ++ ++ if self._callbacks: ++ info.append(self._format_callbacks()) ++ ++ if self._fut_waiter is not None: ++ info.append('wait_for=%r' % self._fut_waiter) ++ ++ return '<%s %s>' % (self.__class__.__name__, ' '.join(info)) + + def get_stack(self, *, limit=None): + """Return the list of stack frames for this task's coroutine. +@@ -269,9 +195,9 @@ + print(line, file=file, end='') + + def cancel(self): +- """Request that a task to cancel itself. ++ """Request this task to cancel itself. + +- This arranges for a CancellationError to be thrown into the ++ This arranges for a CancelledError to be thrown into the + wrapped coroutine on the next cycle through the event loop. + The coroutine then has a chance to clean up or even deny + the request using try/except/finally. +@@ -387,6 +313,8 @@ + def wait(fs, *, loop=None, timeout=None, return_when=ALL_COMPLETED): + """Wait for the Futures and coroutines given by fs to complete. + ++ The sequence futures must not be empty. ++ + Coroutines will be wrapped in Tasks. + + Returns two sets of Future: (done, pending). +@@ -398,18 +326,18 @@ + Note: This does not raise TimeoutError! Futures that aren't done + when the timeout occurs are returned in the second set. 
+ """ +- if isinstance(fs, futures.Future) or iscoroutine(fs): ++ if isinstance(fs, futures.Future) or coroutines.iscoroutine(fs): + raise TypeError("expect a list of futures, not %s" % type(fs).__name__) + if not fs: + raise ValueError('Set of coroutines/Futures is empty.') ++ if return_when not in (FIRST_COMPLETED, FIRST_EXCEPTION, ALL_COMPLETED): ++ raise ValueError('Invalid return_when value: {}'.format(return_when)) + + if loop is None: + loop = events.get_event_loop() + + fs = {async(f, loop=loop) for f in set(fs)} + +- if return_when not in (FIRST_COMPLETED, FIRST_EXCEPTION, ALL_COMPLETED): +- raise ValueError('Invalid return_when value: {}'.format(return_when)) + return (yield from _wait(fs, timeout, return_when, loop)) + + +@@ -520,7 +448,7 @@ + + Note: The futures 'f' are not necessarily members of fs. + """ +- if isinstance(fs, futures.Future) or iscoroutine(fs): ++ if isinstance(fs, futures.Future) or coroutines.iscoroutine(fs): + raise TypeError("expect a list of futures, not %s" % type(fs).__name__) + loop = loop if loop is not None else events.get_event_loop() + todo = {async(f, loop=loop) for f in set(fs)} +@@ -562,7 +490,8 @@ + def sleep(delay, result=None, *, loop=None): + """Coroutine that completes after a given time (in seconds).""" + future = futures.Future(loop=loop) +- h = future._loop.call_later(delay, future.set_result, result) ++ h = future._loop.call_later(delay, ++ future._set_result_unless_cancelled, result) + try: + return (yield from future) + finally: +@@ -578,8 +507,13 @@ + if loop is not None and loop is not coro_or_future._loop: + raise ValueError('loop argument must agree with Future') + return coro_or_future +- elif iscoroutine(coro_or_future): +- return Task(coro_or_future, loop=loop) ++ elif coroutines.iscoroutine(coro_or_future): ++ if loop is None: ++ loop = events.get_event_loop() ++ task = loop.create_task(coro_or_future) ++ if task._source_traceback: ++ del task._source_traceback[-1] ++ return task + else: + raise TypeError('A Future or coroutine is required') + +@@ -624,21 +558,33 @@ + prevent the cancellation of one child to cause other children to + be cancelled.) + """ +- arg_to_fut = {arg: async(arg, loop=loop) for arg in set(coros_or_futures)} +- children = [arg_to_fut[arg] for arg in coros_or_futures] +- n = len(children) +- if n == 0: ++ if not coros_or_futures: + outer = futures.Future(loop=loop) + outer.set_result([]) + return outer +- if loop is None: +- loop = children[0]._loop +- for fut in children: +- if fut._loop is not loop: +- raise ValueError("futures are tied to different event loops") ++ ++ arg_to_fut = {} ++ for arg in set(coros_or_futures): ++ if not isinstance(arg, futures.Future): ++ fut = async(arg, loop=loop) ++ if loop is None: ++ loop = fut._loop ++ # The caller cannot control this future, the "destroy pending task" ++ # warning should not be emitted. 
++ fut._log_destroy_pending = False ++ else: ++ fut = arg ++ if loop is None: ++ loop = fut._loop ++ elif fut._loop is not loop: ++ raise ValueError("futures are tied to different event loops") ++ arg_to_fut[arg] = fut ++ ++ children = [arg_to_fut[arg] for arg in coros_or_futures] ++ nchildren = len(children) + outer = _GatheringFuture(children, loop=loop) + nfinished = 0 +- results = [None] * n ++ results = [None] * nchildren + + def _done_callback(i, fut): + nonlocal nfinished +@@ -661,7 +607,7 @@ + res = fut._result + results[i] = res + nfinished += 1 +- if nfinished == n: ++ if nfinished == nchildren: + outer.set_result(results) + + for i, fut in enumerate(children): +diff -r c0e311e010fc Lib/asyncio/test_utils.py +--- a/Lib/asyncio/test_utils.py ++++ b/Lib/asyncio/test_utils.py +@@ -3,6 +3,7 @@ + import collections + import contextlib + import io ++import logging + import os + import re + import socket +@@ -11,6 +12,7 @@ + import tempfile + import threading + import time ++import unittest + from unittest import mock + + from http.server import HTTPServer +@@ -26,6 +28,8 @@ + from . import futures + from . import selectors + from . import tasks ++from .coroutines import coroutine ++from .log import logger + + + if sys.platform == 'win32': # pragma: no cover +@@ -42,11 +46,14 @@ + + + def run_briefly(loop): +- @tasks.coroutine ++ @coroutine + def once(): + pass + gen = once() +- t = tasks.Task(gen, loop=loop) ++ t = loop.create_task(gen) ++ # Don't log a warning if the task is not done after run_until_complete(). ++ # It occurs if the loop is stopped or if a task raises a BaseException. ++ t._log_destroy_pending = False + try: + loop.run_until_complete(t) + finally: +@@ -372,3 +379,41 @@ + """ + def __eq__(self, other): + return bool(re.search(str(self), other, re.S)) ++ ++ ++def get_function_source(func): ++ source = events._get_function_source(func) ++ if source is None: ++ raise ValueError("unable to get the source of %r" % (func,)) ++ return source ++ ++ ++class TestCase(unittest.TestCase): ++ def set_event_loop(self, loop, *, cleanup=True): ++ assert loop is not None ++ # ensure that the event loop is passed explicitly in asyncio ++ events.set_event_loop(None) ++ if cleanup: ++ self.addCleanup(loop.close) ++ ++ def new_test_loop(self, gen=None): ++ loop = TestLoop(gen) ++ self.set_event_loop(loop) ++ return loop ++ ++ def tearDown(self): ++ events.set_event_loop(None) ++ ++ ++@contextlib.contextmanager ++def disable_logger(): ++ """Context manager to disable asyncio logger. ++ ++ For example, it can be used to ignore warnings in debug mode. ++ """ ++ old_level = logger.level ++ try: ++ logger.setLevel(logging.CRITICAL+1) ++ yield ++ finally: ++ logger.setLevel(old_level) +diff -r c0e311e010fc Lib/asyncio/unix_events.py +--- a/Lib/asyncio/unix_events.py ++++ b/Lib/asyncio/unix_events.py +@@ -16,8 +16,9 @@ + from . import constants + from . import events + from . import selector_events +-from . import tasks ++from . import selectors + from . import transports ++from .coroutines import coroutine + from .log import logger + + +@@ -30,6 +31,11 @@ + raise ImportError('Signals are not really supported on Windows') + + ++def _sighandler_noop(signum, frame): ++ """Dummy signal handler.""" ++ pass ++ ++ + class _UnixSelectorEventLoop(selector_events.BaseSelectorEventLoop): + """Unix event loop. 
+ +@@ -44,9 +50,16 @@ + return socket.socketpair() + + def close(self): ++ super().close() + for sig in list(self._signal_handlers): + self.remove_signal_handler(sig) +- super().close() ++ ++ def _process_self_data(self, data): ++ for signum in data: ++ if not signum: ++ # ignore null bytes written by _write_to_self() ++ continue ++ self._handle_signal(signum) + + def add_signal_handler(self, sig, callback, *args): + """Add a handler for a signal. UNIX only. +@@ -61,14 +74,18 @@ + # event loop running in another thread cannot add a signal + # handler. + signal.set_wakeup_fd(self._csock.fileno()) +- except ValueError as exc: ++ except (ValueError, OSError) as exc: + raise RuntimeError(str(exc)) + + handle = events.Handle(callback, args, self) + self._signal_handlers[sig] = handle + + try: +- signal.signal(sig, self._handle_signal) ++ # Register a dummy signal handler to ask Python to write the signal ++ # number in the wakup file descriptor. _process_self_data() will ++ # read signal numbers from this file descriptor to handle signals. ++ signal.signal(sig, _sighandler_noop) ++ + # Set SA_RESTART to limit EINTR occurrences. + signal.siginterrupt(sig, False) + except OSError as exc: +@@ -76,7 +93,7 @@ + if not self._signal_handlers: + try: + signal.set_wakeup_fd(-1) +- except ValueError as nexc: ++ except (ValueError, OSError) as nexc: + logger.info('set_wakeup_fd(-1) failed: %s', nexc) + + if exc.errno == errno.EINVAL: +@@ -84,7 +101,7 @@ + else: + raise + +- def _handle_signal(self, sig, arg): ++ def _handle_signal(self, sig): + """Internal helper that is the actual signal handler.""" + handle = self._signal_handlers.get(sig) + if handle is None: +@@ -121,7 +138,7 @@ + if not self._signal_handlers: + try: + signal.set_wakeup_fd(-1) +- except ValueError as exc: ++ except (ValueError, OSError) as exc: + logger.info('set_wakeup_fd(-1) failed: %s', exc) + + return True +@@ -147,7 +164,7 @@ + extra=None): + return _UnixWritePipeTransport(self, pipe, protocol, waiter, extra) + +- @tasks.coroutine ++ @coroutine + def _make_subprocess_transport(self, protocol, args, shell, + stdin, stdout, stderr, bufsize, + extra=None, **kwargs): +@@ -164,7 +181,7 @@ + def _child_watcher_callback(self, pid, returncode, transp): + self.call_soon_threadsafe(transp._process_exited, returncode) + +- @tasks.coroutine ++ @coroutine + def create_unix_connection(self, protocol_factory, path, *, + ssl=None, sock=None, + server_hostname=None): +@@ -199,7 +216,7 @@ + sock, protocol_factory, ssl, server_hostname) + return transport, protocol + +- @tasks.coroutine ++ @coroutine + def create_unix_server(self, protocol_factory, path=None, *, + sock=None, backlog=100, ssl=None): + if isinstance(ssl, bool): +@@ -223,6 +240,9 @@ + raise OSError(errno.EADDRINUSE, msg) from None + else: + raise ++ except: ++ sock.close() ++ raise + else: + if sock is None: + raise ValueError( +@@ -266,7 +286,22 @@ + self._loop.add_reader(self._fileno, self._read_ready) + self._loop.call_soon(self._protocol.connection_made, self) + if waiter is not None: +- self._loop.call_soon(waiter.set_result, None) ++ # wait until protocol.connection_made() has been called ++ self._loop.call_soon(waiter._set_result_unless_cancelled, None) ++ ++ def __repr__(self): ++ info = [self.__class__.__name__, 'fd=%s' % self._fileno] ++ if self._pipe is not None: ++ polling = selector_events._test_selector_event( ++ self._loop._selector, ++ self._fileno, selectors.EVENT_READ) ++ if polling: ++ info.append('polling') ++ else: ++ info.append('idle') ++ else: ++ 
info.append('closed') ++ return '<%s>' % ' '.join(info) + + def _read_ready(self): + try: +@@ -279,6 +314,8 @@ + if data: + self._protocol.data_received(data) + else: ++ if self._loop.get_debug(): ++ logger.info("%r was closed by peer", self) + self._closing = True + self._loop.remove_reader(self._fileno) + self._loop.call_soon(self._protocol.eof_received) +@@ -350,13 +387,33 @@ + + self._loop.call_soon(self._protocol.connection_made, self) + if waiter is not None: +- self._loop.call_soon(waiter.set_result, None) ++ # wait until protocol.connection_made() has been called ++ self._loop.call_soon(waiter._set_result_unless_cancelled, None) ++ ++ def __repr__(self): ++ info = [self.__class__.__name__, 'fd=%s' % self._fileno] ++ if self._pipe is not None: ++ polling = selector_events._test_selector_event( ++ self._loop._selector, ++ self._fileno, selectors.EVENT_WRITE) ++ if polling: ++ info.append('polling') ++ else: ++ info.append('idle') ++ ++ bufsize = self.get_write_buffer_size() ++ info.append('bufsize=%s' % bufsize) ++ else: ++ info.append('closed') ++ return '<%s>' % ' '.join(info) + + def get_write_buffer_size(self): + return sum(len(data) for data in self._buffer) + + def _read_ready(self): + # Pipe was closed by peer. ++ if self._loop.get_debug(): ++ logger.info("%r was closed by peer", self) + if self._buffer: + self._close(BrokenPipeError()) + else: +@@ -491,7 +548,7 @@ + universal_newlines=False, bufsize=bufsize, **kwargs) + if stdin_w is not None: + stdin.close() +- self._proc.stdin = open(stdin_w.detach(), 'rb', buffering=bufsize) ++ self._proc.stdin = open(stdin_w.detach(), 'wb', buffering=bufsize) + + + class AbstractChildWatcher: +@@ -524,7 +581,7 @@ + process 'pid' terminates. Specifying another callback for the same + process replaces the previous handler. + +- Note: callback() must be thread-safe ++ Note: callback() must be thread-safe. + """ + raise NotImplementedError() + +@@ -680,6 +737,9 @@ + return + + returncode = self._compute_returncode(status) ++ if self._loop.get_debug(): ++ logger.debug('process %s exited with returncode %s', ++ expected_pid, returncode) + + try: + callback, args = self._callbacks.pop(pid) +@@ -777,8 +837,16 @@ + if self._forks: + # It may not be registered yet. + self._zombies[pid] = returncode ++ if self._loop.get_debug(): ++ logger.debug('unknown process %s exited ' ++ 'with returncode %s', ++ pid, returncode) + continue + callback = None ++ else: ++ if self._loop.get_debug(): ++ logger.debug('process %s exited with returncode %s', ++ pid, returncode) + + if callback is None: + logger.warning( +@@ -819,7 +887,7 @@ + self._watcher.attach_loop(loop) + + def get_child_watcher(self): +- """Get the child watcher ++ """Get the watcher for child processes. + + If not yet set, a SafeChildWatcher object is automatically created. + """ +@@ -829,7 +897,7 @@ + return self._watcher + + def set_child_watcher(self, watcher): +- """Set the child watcher""" ++ """Set the watcher for child processes.""" + + assert watcher is None or isinstance(watcher, AbstractChildWatcher) + +diff -r c0e311e010fc Lib/asyncio/windows_events.py +--- a/Lib/asyncio/windows_events.py ++++ b/Lib/asyncio/windows_events.py +@@ -14,8 +14,9 @@ + from . import selector_events + from . import tasks + from . import windows_utils ++from . import _overlapped ++from .coroutines import coroutine + from .log import logger +-from . 
import _overlapped + + + __all__ = ['SelectorEventLoop', 'ProactorEventLoop', 'IocpProactor', +@@ -37,30 +38,89 @@ + + def __init__(self, ov, *, loop=None): + super().__init__(loop=loop) +- self.ov = ov ++ if self._source_traceback: ++ del self._source_traceback[-1] ++ self._ov = ov ++ ++ def __repr__(self): ++ info = [self._state.lower()] ++ if self._ov is not None: ++ state = 'pending' if self._ov.pending else 'completed' ++ info.append('overlapped=<%s, %#x>' % (state, self._ov.address)) ++ if self._state == futures._FINISHED: ++ info.append(self._format_result()) ++ if self._callbacks: ++ info.append(self._format_callbacks()) ++ return '<%s %s>' % (self.__class__.__name__, ' '.join(info)) ++ ++ def _cancel_overlapped(self): ++ if self._ov is None: ++ return ++ try: ++ self._ov.cancel() ++ except OSError as exc: ++ context = { ++ 'message': 'Cancelling an overlapped future failed', ++ 'exception': exc, ++ 'future': self, ++ } ++ if self._source_traceback: ++ context['source_traceback'] = self._source_traceback ++ self._loop.call_exception_handler(context) ++ self._ov = None + + def cancel(self): +- try: +- self.ov.cancel() +- except OSError: +- pass ++ self._cancel_overlapped() + return super().cancel() + ++ def set_exception(self, exception): ++ super().set_exception(exception) ++ self._cancel_overlapped() ++ ++ def set_result(self, result): ++ super().set_result(result) ++ self._ov = None ++ + + class _WaitHandleFuture(futures.Future): + """Subclass of Future which represents a wait handle.""" + +- def __init__(self, wait_handle, *, loop=None): ++ def __init__(self, handle, wait_handle, *, loop=None): + super().__init__(loop=loop) ++ self._handle = handle + self._wait_handle = wait_handle + +- def cancel(self): +- super().cancel() ++ def _poll(self): ++ # non-blocking wait: use a timeout of 0 millisecond ++ return (_winapi.WaitForSingleObject(self._handle, 0) == ++ _winapi.WAIT_OBJECT_0) ++ ++ def __repr__(self): ++ info = [self._state.lower()] ++ if self._wait_handle: ++ state = 'pending' if self._poll() else 'completed' ++ info.append('wait_handle=<%s, %#x>' % (state, self._wait_handle)) ++ info.append('handle=<%#x>' % self._handle) ++ if self._state == futures._FINISHED: ++ info.append(self._format_result()) ++ if self._callbacks: ++ info.append(self._format_callbacks()) ++ return '<%s %s>' % (self.__class__.__name__, ' '.join(info)) ++ ++ def _unregister(self): ++ if self._wait_handle is None: ++ return + try: + _overlapped.UnregisterWait(self._wait_handle) + except OSError as e: + if e.winerror != _overlapped.ERROR_IO_PENDING: + raise ++ # ERROR_IO_PENDING is not an error, the wait was unregistered ++ self._wait_handle = None ++ ++ def cancel(self): ++ self._unregister() ++ return super().cancel() + + + class PipeServer(object): +@@ -129,7 +189,7 @@ + def _socketpair(self): + return windows_utils.socketpair() + +- @tasks.coroutine ++ @coroutine + def create_pipe_connection(self, protocol_factory, address): + f = self._proactor.connect_pipe(address) + pipe = yield from f +@@ -138,7 +198,7 @@ + extra={'addr': address}) + return trans, protocol + +- @tasks.coroutine ++ @coroutine + def start_serving_pipe(self, protocol_factory, address): + server = PipeServer(address) + +@@ -172,7 +232,7 @@ + self.call_soon(loop) + return [server] + +- @tasks.coroutine ++ @coroutine + def _make_subprocess_transport(self, protocol, args, shell, + stdin, stdout, stderr, bufsize, + extra=None, **kwargs): +@@ -195,6 +255,11 @@ + self._registered = weakref.WeakSet() + self._stopped_serving = 
weakref.WeakSet() + ++ def __repr__(self): ++ return ('<%s overlapped#=%s result#=%s>' ++ % (self.__class__.__name__, len(self._cache), ++ len(self._results))) ++ + def set_loop(self, loop): + self._loop = loop + +@@ -258,7 +323,7 @@ + conn.settimeout(listener.gettimeout()) + return conn, conn.getpeername() + +- @tasks.coroutine ++ @coroutine + def accept_coro(future, conn): + # Coroutine closing the accept socket if the future is cancelled + try: +@@ -337,23 +402,19 @@ + ov = _overlapped.Overlapped(NULL) + wh = _overlapped.RegisterWaitWithQueue( + handle, self._iocp, ov.address, ms) +- f = _WaitHandleFuture(wh, loop=self._loop) ++ f = _WaitHandleFuture(handle, wh, loop=self._loop) + + def finish_wait_for_handle(trans, key, ov): +- if not f.cancelled(): +- try: +- _overlapped.UnregisterWait(wh) +- except OSError as e: +- if e.winerror != _overlapped.ERROR_IO_PENDING: +- raise + # Note that this second wait means that we should only use + # this with handles types where a successful wait has no + # effect. So events or processes are all right, but locks + # or semaphores are not. Also note if the handle is + # signalled and then quickly reset, then we may return + # False even though we have not timed out. +- return (_winapi.WaitForSingleObject(handle, 0) == +- _winapi.WAIT_OBJECT_0) ++ try: ++ return f._poll() ++ finally: ++ f._unregister() + + self._cache[ov.address] = (f, ov, None, finish_wait_for_handle) + return f +@@ -421,6 +482,13 @@ + _winapi.CloseHandle(key) + ms = 0 + continue ++ ++ if ov.pending: ++ # False alarm: the overlapped operation is not completed. ++ # FIXME: why do we get false alarms? ++ self._cache[address] = (f, ov, obj, callback) ++ continue ++ + if obj in self._stopped_serving: + f.cancel() + elif not f.cancelled(): +@@ -442,7 +510,7 @@ + + def close(self): + # Cancel remaining registered operations. +- for address, (f, ov, obj, callback) in list(self._cache.items()): ++ for address, (fut, ov, obj, callback) in list(self._cache.items()): + if obj is None: + # The operation was started with connect_pipe() which + # queues a task to Windows' thread pool. This cannot +@@ -450,9 +518,17 @@ + del self._cache[address] + else: + try: +- ov.cancel() +- except OSError: +- pass ++ fut.cancel() ++ except OSError as exc: ++ if self._loop is not None: ++ context = { ++ 'message': 'Cancelling a future failed', ++ 'exception': exc, ++ 'future': fut, ++ } ++ if fut._source_traceback: ++ context['source_traceback'] = fut._source_traceback ++ self._loop.call_exception_handler(context) + + while self._cache: + if not self._poll(1): +@@ -463,6 +539,9 @@ + _winapi.CloseHandle(self._iocp) + self._iocp = None + ++ def __del__(self): ++ self.close() ++ + + class _WindowsSubprocessTransport(base_subprocess.BaseSubprocessTransport): + +diff -r c0e311e010fc Lib/asyncio/windows_utils.py +--- a/Lib/asyncio/windows_utils.py ++++ b/Lib/asyncio/windows_utils.py +@@ -51,23 +51,25 @@ + # We create a connected TCP socket. Note the trick with setblocking(0) + # that prevents us from having to create a thread. 
+ lsock = socket.socket(family, type, proto) +- lsock.bind((host, 0)) +- lsock.listen(1) +- # On IPv6, ignore flow_info and scope_id +- addr, port = lsock.getsockname()[:2] +- csock = socket.socket(family, type, proto) +- csock.setblocking(False) + try: +- csock.connect((addr, port)) +- except (BlockingIOError, InterruptedError): +- pass +- except Exception: ++ lsock.bind((host, 0)) ++ lsock.listen(1) ++ # On IPv6, ignore flow_info and scope_id ++ addr, port = lsock.getsockname()[:2] ++ csock = socket.socket(family, type, proto) ++ try: ++ csock.setblocking(False) ++ try: ++ csock.connect((addr, port)) ++ except (BlockingIOError, InterruptedError): ++ pass ++ ssock, _ = lsock.accept() ++ csock.setblocking(True) ++ except: ++ csock.close() ++ raise ++ finally: + lsock.close() +- csock.close() +- raise +- ssock, _ = lsock.accept() +- csock.setblocking(True) +- lsock.close() + return (ssock, csock) + + +diff -r c0e311e010fc Lib/asyncore.py +--- a/Lib/asyncore.py ++++ b/Lib/asyncore.py +@@ -614,6 +614,11 @@ + def __init__(self, fd): + self.fd = os.dup(fd) + ++ def __del__(self): ++ if self.fd >= 0: ++ warnings.warn("unclosed file %r" % self, ResourceWarning) ++ self.close() ++ + def recv(self, *args): + return os.read(self.fd, *args) + +@@ -632,7 +637,10 @@ + write = send + + def close(self): ++ if self.fd < 0: ++ return + os.close(self.fd) ++ self.fd = -1 + + def fileno(self): + return self.fd +diff -r c0e311e010fc Lib/collections/__init__.py +--- a/Lib/collections/__init__.py ++++ b/Lib/collections/__init__.py +@@ -323,6 +323,7 @@ + if isinstance(field_names, str): + field_names = field_names.replace(',', ' ').split() + field_names = list(map(str, field_names)) ++ typename = str(typename) + if rename: + seen = set() + for index, name in enumerate(field_names): +@@ -333,6 +334,8 @@ + field_names[index] = '_%d' % index + seen.add(name) + for name in [typename] + field_names: ++ if type(name) != str: ++ raise TypeError('Type names and field names must be strings') + if not name.isidentifier(): + raise ValueError('Type names and field names must be valid ' + 'identifiers: %r' % name) +diff -r c0e311e010fc Lib/ctypes/test/__init__.py +--- a/Lib/ctypes/test/__init__.py ++++ b/Lib/ctypes/test/__init__.py +@@ -2,7 +2,15 @@ + + use_resources = [] + +-class ResourceDenied(Exception): ++import ctypes ++ctypes_symbols = dir(ctypes) ++ ++def need_symbol(name): ++ return unittest.skipUnless(name in ctypes_symbols, ++ '{!r} is required'.format(name)) ++ ++ ++class ResourceDenied(unittest.SkipTest): + """Test skipped because it requested a disallowed resource. 
+ + This is raised when a test calls requires() for a resource that +diff -r c0e311e010fc Lib/ctypes/test/test_arrays.py +--- a/Lib/ctypes/test/test_arrays.py ++++ b/Lib/ctypes/test/test_arrays.py +@@ -1,6 +1,8 @@ + import unittest + from ctypes import * + ++from ctypes.test import need_symbol ++ + formats = "bBhHiIlLqQfd" + + formats = c_byte, c_ubyte, c_short, c_ushort, c_int, c_uint, \ +@@ -98,20 +100,16 @@ + self.assertEqual(sz[1:4:2], b"o") + self.assertEqual(sz.value, b"foo") + +- try: +- create_unicode_buffer +- except NameError: +- pass +- else: +- def test_from_addressW(self): +- p = create_unicode_buffer("foo") +- sz = (c_wchar * 3).from_address(addressof(p)) +- self.assertEqual(sz[:], "foo") +- self.assertEqual(sz[::], "foo") +- self.assertEqual(sz[::-1], "oof") +- self.assertEqual(sz[::3], "f") +- self.assertEqual(sz[1:4:2], "o") +- self.assertEqual(sz.value, "foo") ++ @need_symbol('create_unicode_buffer') ++ def test_from_addressW(self): ++ p = create_unicode_buffer("foo") ++ sz = (c_wchar * 3).from_address(addressof(p)) ++ self.assertEqual(sz[:], "foo") ++ self.assertEqual(sz[::], "foo") ++ self.assertEqual(sz[::-1], "oof") ++ self.assertEqual(sz[::3], "f") ++ self.assertEqual(sz[1:4:2], "o") ++ self.assertEqual(sz.value, "foo") + + def test_cache(self): + # Array types are cached internally in the _ctypes extension, +diff -r c0e311e010fc Lib/ctypes/test/test_as_parameter.py +--- a/Lib/ctypes/test/test_as_parameter.py ++++ b/Lib/ctypes/test/test_as_parameter.py +@@ -1,5 +1,6 @@ + import unittest + from ctypes import * ++from ctypes.test import need_symbol + import _ctypes_test + + dll = CDLL(_ctypes_test.__file__) +@@ -17,11 +18,8 @@ + def wrap(self, param): + return param + ++ @need_symbol('c_wchar') + def test_wchar_parm(self): +- try: +- c_wchar +- except NameError: +- return + f = dll._testfunc_i_bhilfd + f.argtypes = [c_byte, c_wchar, c_int, c_long, c_float, c_double] + result = f(self.wrap(1), self.wrap("x"), self.wrap(3), self.wrap(4), self.wrap(5.0), self.wrap(6.0)) +diff -r c0e311e010fc Lib/ctypes/test/test_bitfields.py +--- a/Lib/ctypes/test/test_bitfields.py ++++ b/Lib/ctypes/test/test_bitfields.py +@@ -1,4 +1,5 @@ + from ctypes import * ++from ctypes.test import need_symbol + import unittest + import os + +@@ -127,20 +128,18 @@ + result = self.fail_fields(("a", c_char, 1)) + self.assertEqual(result, (TypeError, 'bit fields not allowed for type c_char')) + +- try: +- c_wchar +- except NameError: +- pass +- else: +- result = self.fail_fields(("a", c_wchar, 1)) +- self.assertEqual(result, (TypeError, 'bit fields not allowed for type c_wchar')) +- + class Dummy(Structure): + _fields_ = [] + + result = self.fail_fields(("a", Dummy, 1)) + self.assertEqual(result, (TypeError, 'bit fields not allowed for type Dummy')) + ++ @need_symbol('c_wchar') ++ def test_c_wchar(self): ++ result = self.fail_fields(("a", c_wchar, 1)) ++ self.assertEqual(result, ++ (TypeError, 'bit fields not allowed for type c_wchar')) ++ + def test_single_bitfield_size(self): + for c_typ in int_types: + result = self.fail_fields(("a", c_typ, -1)) +@@ -240,7 +239,7 @@ + _anonymous_ = ["_"] + _fields_ = [("_", X)] + +- @unittest.skipUnless(hasattr(ctypes, "c_uint32"), "c_int32 is required") ++ @need_symbol('c_uint32') + def test_uint32(self): + class X(Structure): + _fields_ = [("a", c_uint32, 32)] +@@ -250,7 +249,7 @@ + x.a = 0xFDCBA987 + self.assertEqual(x.a, 0xFDCBA987) + +- @unittest.skipUnless(hasattr(ctypes, "c_uint64"), "c_int64 is required") ++ @need_symbol('c_uint64') + def test_uint64(self): + 
class X(Structure): + _fields_ = [("a", c_uint64, 64)] +diff -r c0e311e010fc Lib/ctypes/test/test_buffers.py +--- a/Lib/ctypes/test/test_buffers.py ++++ b/Lib/ctypes/test/test_buffers.py +@@ -1,4 +1,5 @@ + from ctypes import * ++from ctypes.test import need_symbol + import unittest + + class StringBufferTestCase(unittest.TestCase): +@@ -24,39 +25,36 @@ + self.assertEqual(len(bytearray(create_string_buffer(0))), 0) + self.assertEqual(len(bytearray(create_string_buffer(1))), 1) + +- try: +- c_wchar +- except NameError: +- pass +- else: +- def test_unicode_buffer(self): +- b = create_unicode_buffer(32) +- self.assertEqual(len(b), 32) +- self.assertEqual(sizeof(b), 32 * sizeof(c_wchar)) +- self.assertIs(type(b[0]), str) ++ @need_symbol('c_wchar') ++ def test_unicode_buffer(self): ++ b = create_unicode_buffer(32) ++ self.assertEqual(len(b), 32) ++ self.assertEqual(sizeof(b), 32 * sizeof(c_wchar)) ++ self.assertIs(type(b[0]), str) + +- b = create_unicode_buffer("abc") +- self.assertEqual(len(b), 4) # trailing nul char +- self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) +- self.assertIs(type(b[0]), str) +- self.assertEqual(b[0], "a") +- self.assertEqual(b[:], "abc\0") +- self.assertEqual(b[::], "abc\0") +- self.assertEqual(b[::-1], "\0cba") +- self.assertEqual(b[::2], "ac") +- self.assertEqual(b[::5], "a") ++ b = create_unicode_buffer("abc") ++ self.assertEqual(len(b), 4) # trailing nul char ++ self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) ++ self.assertIs(type(b[0]), str) ++ self.assertEqual(b[0], "a") ++ self.assertEqual(b[:], "abc\0") ++ self.assertEqual(b[::], "abc\0") ++ self.assertEqual(b[::-1], "\0cba") ++ self.assertEqual(b[::2], "ac") ++ self.assertEqual(b[::5], "a") + +- def test_unicode_conversion(self): +- b = create_unicode_buffer("abc") +- self.assertEqual(len(b), 4) # trailing nul char +- self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) +- self.assertIs(type(b[0]), str) +- self.assertEqual(b[0], "a") +- self.assertEqual(b[:], "abc\0") +- self.assertEqual(b[::], "abc\0") +- self.assertEqual(b[::-1], "\0cba") +- self.assertEqual(b[::2], "ac") +- self.assertEqual(b[::5], "a") ++ @need_symbol('c_wchar') ++ def test_unicode_conversion(self): ++ b = create_unicode_buffer("abc") ++ self.assertEqual(len(b), 4) # trailing nul char ++ self.assertEqual(sizeof(b), 4 * sizeof(c_wchar)) ++ self.assertIs(type(b[0]), str) ++ self.assertEqual(b[0], "a") ++ self.assertEqual(b[:], "abc\0") ++ self.assertEqual(b[::], "abc\0") ++ self.assertEqual(b[::-1], "\0cba") ++ self.assertEqual(b[::2], "ac") ++ self.assertEqual(b[::5], "a") + + if __name__ == "__main__": + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_bytes.py +--- a/Lib/ctypes/test/test_bytes.py ++++ b/Lib/ctypes/test/test_bytes.py +@@ -38,13 +38,13 @@ + self.assertEqual(x.a, "abc") + self.assertEqual(type(x.a), str) + +- if sys.platform == "win32": +- def test_BSTR(self): +- from _ctypes import _SimpleCData +- class BSTR(_SimpleCData): +- _type_ = "X" ++ @unittest.skipUnless(sys.platform == "win32", 'Windows-specific test') ++ def test_BSTR(self): ++ from _ctypes import _SimpleCData ++ class BSTR(_SimpleCData): ++ _type_ = "X" + +- BSTR("abc") ++ BSTR("abc") + + if __name__ == '__main__': + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_byteswap.py +--- a/Lib/ctypes/test/test_byteswap.py ++++ b/Lib/ctypes/test/test_byteswap.py +@@ -14,7 +14,8 @@ + # For Structures and Unions, these types are created on demand. 
+ + class Test(unittest.TestCase): +- def X_test(self): ++ @unittest.skip('test disabled') ++ def test_X(self): + print(sys.byteorder, file=sys.stderr) + for i in range(32): + bits = BITS() +diff -r c0e311e010fc Lib/ctypes/test/test_callbacks.py +--- a/Lib/ctypes/test/test_callbacks.py ++++ b/Lib/ctypes/test/test_callbacks.py +@@ -1,5 +1,6 @@ + import unittest + from ctypes import * ++from ctypes.test import need_symbol + import _ctypes_test + + class Callbacks(unittest.TestCase): +@@ -88,9 +89,10 @@ + # disabled: would now (correctly) raise a RuntimeWarning about + # a memory leak. A callback function cannot return a non-integral + # C type without causing a memory leak. +-## def test_char_p(self): +-## self.check_type(c_char_p, "abc") +-## self.check_type(c_char_p, "def") ++ @unittest.skip('test disabled') ++ def test_char_p(self): ++ self.check_type(c_char_p, "abc") ++ self.check_type(c_char_p, "def") + + def test_pyobject(self): + o = () +@@ -142,13 +144,12 @@ + CFUNCTYPE(None)(lambda x=Nasty(): None) + + +-try: +- WINFUNCTYPE +-except NameError: +- pass +-else: +- class StdcallCallbacks(Callbacks): ++@need_symbol('WINFUNCTYPE') ++class StdcallCallbacks(Callbacks): ++ try: + functype = WINFUNCTYPE ++ except NameError: ++ pass + + ################################################################ + +@@ -178,7 +179,7 @@ + from ctypes.util import find_library + libc_path = find_library("c") + if not libc_path: +- return # cannot test ++ self.skipTest('could not find libc') + libc = CDLL(libc_path) + + @CFUNCTYPE(c_int, POINTER(c_int), POINTER(c_int)) +@@ -190,23 +191,19 @@ + libc.qsort(array, len(array), sizeof(c_int), cmp_func) + self.assertEqual(array[:], [1, 5, 7, 33, 99]) + +- try: +- WINFUNCTYPE +- except NameError: +- pass +- else: +- def test_issue_8959_b(self): +- from ctypes.wintypes import BOOL, HWND, LPARAM ++ @need_symbol('WINFUNCTYPE') ++ def test_issue_8959_b(self): ++ from ctypes.wintypes import BOOL, HWND, LPARAM ++ global windowCount ++ windowCount = 0 ++ ++ @WINFUNCTYPE(BOOL, HWND, LPARAM) ++ def EnumWindowsCallbackFunc(hwnd, lParam): + global windowCount +- windowCount = 0 ++ windowCount += 1 ++ return True #Allow windows to keep enumerating + +- @WINFUNCTYPE(BOOL, HWND, LPARAM) +- def EnumWindowsCallbackFunc(hwnd, lParam): +- global windowCount +- windowCount += 1 +- return True #Allow windows to keep enumerating +- +- windll.user32.EnumWindows(EnumWindowsCallbackFunc, 0) ++ windll.user32.EnumWindows(EnumWindowsCallbackFunc, 0) + + def test_callback_register_int(self): + # Issue #8275: buggy handling of callback args under Win64 +diff -r c0e311e010fc Lib/ctypes/test/test_cast.py +--- a/Lib/ctypes/test/test_cast.py ++++ b/Lib/ctypes/test/test_cast.py +@@ -1,4 +1,5 @@ + from ctypes import * ++from ctypes.test import need_symbol + import unittest + import sys + +@@ -75,15 +76,11 @@ + self.assertEqual(cast(cast(s, c_void_p), c_char_p).value, + b"hiho") + +- try: +- c_wchar_p +- except NameError: +- pass +- else: +- def test_wchar_p(self): +- s = c_wchar_p("hiho") +- self.assertEqual(cast(cast(s, c_void_p), c_wchar_p).value, +- "hiho") ++ @need_symbol('c_wchar_p') ++ def test_wchar_p(self): ++ s = c_wchar_p("hiho") ++ self.assertEqual(cast(cast(s, c_void_p), c_wchar_p).value, ++ "hiho") + + if __name__ == "__main__": + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_cfuncs.py +--- a/Lib/ctypes/test/test_cfuncs.py ++++ b/Lib/ctypes/test/test_cfuncs.py +@@ -3,6 +3,7 @@ + + import unittest + from ctypes import * ++from ctypes.test import need_symbol + + import 
_ctypes_test + +@@ -193,7 +194,7 @@ + try: + WinDLL + except NameError: +- pass ++ def stdcall_dll(*_): pass + else: + class stdcall_dll(WinDLL): + def __getattr__(self, name): +@@ -203,9 +204,9 @@ + setattr(self, name, func) + return func + +- class stdcallCFunctions(CFunctions): +- _dll = stdcall_dll(_ctypes_test.__file__) +- pass ++@need_symbol('WinDLL') ++class stdcallCFunctions(CFunctions): ++ _dll = stdcall_dll(_ctypes_test.__file__) + + if __name__ == '__main__': + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_checkretval.py +--- a/Lib/ctypes/test/test_checkretval.py ++++ b/Lib/ctypes/test/test_checkretval.py +@@ -1,6 +1,7 @@ + import unittest + + from ctypes import * ++from ctypes.test import need_symbol + + class CHECKED(c_int): + def _check_retval_(value): +@@ -25,15 +26,11 @@ + del dll._testfunc_p_p.restype + self.assertEqual(42, dll._testfunc_p_p(42)) + +- try: +- oledll +- except NameError: +- pass +- else: +- def test_oledll(self): +- self.assertRaises(OSError, +- oledll.oleaut32.CreateTypeLib2, +- 0, None, None) ++ @need_symbol('oledll') ++ def test_oledll(self): ++ self.assertRaises(OSError, ++ oledll.oleaut32.CreateTypeLib2, ++ 0, None, None) + + if __name__ == "__main__": + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_errcheck.py +--- a/Lib/ctypes/test/test_errcheck.py ++++ /dev/null +@@ -1,19 +0,0 @@ +-import sys +-from ctypes import * +- +-##class HMODULE(Structure): +-## _fields_ = [("value", c_void_p)] +- +-## def __repr__(self): +-## return "" % self.value +- +-##windll.kernel32.GetModuleHandleA.restype = HMODULE +- +-##print windll.kernel32.GetModuleHandleA("python23.dll") +-##print hex(sys.dllhandle) +- +-##def nonzero(handle): +-## return (GetLastError(), handle) +- +-##windll.kernel32.GetModuleHandleA.errcheck = nonzero +-##print windll.kernel32.GetModuleHandleA("spam") +diff -r c0e311e010fc Lib/ctypes/test/test_find.py +--- a/Lib/ctypes/test/test_find.py ++++ b/Lib/ctypes/test/test_find.py +@@ -1,4 +1,5 @@ + import unittest ++import os + import sys + from ctypes import * + from ctypes.util import find_library +@@ -40,43 +41,43 @@ + except OSError: + pass + +- if lib_gl: +- def test_gl(self): +- if self.gl: +- self.gl.glClearIndex ++ @unittest.skipUnless(lib_gl, 'lib_gl not available') ++ def test_gl(self): ++ if self.gl: ++ self.gl.glClearIndex + +- if lib_glu: +- def test_glu(self): +- if self.glu: +- self.glu.gluBeginCurve ++ @unittest.skipUnless(lib_glu, 'lib_glu not available') ++ def test_glu(self): ++ if self.glu: ++ self.glu.gluBeginCurve + +- if lib_gle: +- def test_gle(self): +- if self.gle: +- self.gle.gleGetJoinStyle ++ @unittest.skipUnless(lib_gle, 'lib_gle not available') ++ def test_gle(self): ++ if self.gle: ++ self.gle.gleGetJoinStyle + +-##if os.name == "posix" and sys.platform != "darwin": +- +-## # On platforms where the default shared library suffix is '.so', +-## # at least some libraries can be loaded as attributes of the cdll +-## # object, since ctypes now tries loading the lib again +-## # with '.so' appended of the first try fails. +-## # +-## # Won't work for libc, unfortunately. OTOH, it isn't +-## # needed for libc since this is already mapped into the current +-## # process (?) +-## # +-## # On MAC OSX, it won't work either, because dlopen() needs a full path, +-## # and the default suffix is either none or '.dylib'. 
+- +-## class LoadLibs(unittest.TestCase): +-## def test_libm(self): +-## import math +-## libm = cdll.libm +-## sqrt = libm.sqrt +-## sqrt.argtypes = (c_double,) +-## sqrt.restype = c_double +-## self.assertEqual(sqrt(2), math.sqrt(2)) ++# On platforms where the default shared library suffix is '.so', ++# at least some libraries can be loaded as attributes of the cdll ++# object, since ctypes now tries loading the lib again ++# with '.so' appended of the first try fails. ++# ++# Won't work for libc, unfortunately. OTOH, it isn't ++# needed for libc since this is already mapped into the current ++# process (?) ++# ++# On MAC OSX, it won't work either, because dlopen() needs a full path, ++# and the default suffix is either none or '.dylib'. ++@unittest.skip('test disabled') ++@unittest.skipUnless(os.name=="posix" and sys.platform != "darwin", ++ 'test not suitable for this platform') ++class LoadLibs(unittest.TestCase): ++ def test_libm(self): ++ import math ++ libm = cdll.libm ++ sqrt = libm.sqrt ++ sqrt.argtypes = (c_double,) ++ sqrt.restype = c_double ++ self.assertEqual(sqrt(2), math.sqrt(2)) + + if __name__ == "__main__": + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_functions.py +--- a/Lib/ctypes/test/test_functions.py ++++ b/Lib/ctypes/test/test_functions.py +@@ -6,6 +6,7 @@ + """ + + from ctypes import * ++from ctypes.test import need_symbol + import sys, unittest + + try: +@@ -63,22 +64,16 @@ + pass + + ++ @need_symbol('c_wchar') + def test_wchar_parm(self): +- try: +- c_wchar +- except NameError: +- return + f = dll._testfunc_i_bhilfd + f.argtypes = [c_byte, c_wchar, c_int, c_long, c_float, c_double] + result = f(1, "x", 3, 4, 5.0, 6.0) + self.assertEqual(result, 139) + self.assertEqual(type(result), int) + ++ @need_symbol('c_wchar') + def test_wchar_result(self): +- try: +- c_wchar +- except NameError: +- return + f = dll._testfunc_i_bhilfd + f.argtypes = [c_byte, c_short, c_int, c_long, c_float, c_double] + f.restype = c_wchar +@@ -155,11 +150,8 @@ + self.assertEqual(result, -21) + self.assertEqual(type(result), float) + ++ @need_symbol('c_longlong') + def test_longlongresult(self): +- try: +- c_longlong +- except NameError: +- return + f = dll._testfunc_q_bhilfd + f.restype = c_longlong + f.argtypes = [c_byte, c_short, c_int, c_long, c_float, c_double] +@@ -296,6 +288,7 @@ + result = f(-10, cb) + self.assertEqual(result, -18) + ++ @need_symbol('c_longlong') + def test_longlong_callbacks(self): + + f = dll._testfunc_callback_q_qf +@@ -348,16 +341,16 @@ + s2h = dll.ret_2h_func(inp) + self.assertEqual((s2h.x, s2h.y), (99*2, 88*3)) + +- if sys.platform == "win32": +- def test_struct_return_2H_stdcall(self): +- class S2H(Structure): +- _fields_ = [("x", c_short), +- ("y", c_short)] ++ @unittest.skipUnless(sys.platform == "win32", 'Windows-specific test') ++ def test_struct_return_2H_stdcall(self): ++ class S2H(Structure): ++ _fields_ = [("x", c_short), ++ ("y", c_short)] + +- windll.s_ret_2h_func.restype = S2H +- windll.s_ret_2h_func.argtypes = [S2H] +- s2h = windll.s_ret_2h_func(S2H(99, 88)) +- self.assertEqual((s2h.x, s2h.y), (99*2, 88*3)) ++ windll.s_ret_2h_func.restype = S2H ++ windll.s_ret_2h_func.argtypes = [S2H] ++ s2h = windll.s_ret_2h_func(S2H(99, 88)) ++ self.assertEqual((s2h.x, s2h.y), (99*2, 88*3)) + + def test_struct_return_8H(self): + class S8I(Structure): +@@ -376,23 +369,24 @@ + self.assertEqual((s8i.a, s8i.b, s8i.c, s8i.d, s8i.e, s8i.f, s8i.g, s8i.h), + (9*2, 8*3, 7*4, 6*5, 5*6, 4*7, 3*8, 2*9)) + +- if sys.platform == "win32": +- def 
test_struct_return_8H_stdcall(self): +- class S8I(Structure): +- _fields_ = [("a", c_int), +- ("b", c_int), +- ("c", c_int), +- ("d", c_int), +- ("e", c_int), +- ("f", c_int), +- ("g", c_int), +- ("h", c_int)] +- windll.s_ret_8i_func.restype = S8I +- windll.s_ret_8i_func.argtypes = [S8I] +- inp = S8I(9, 8, 7, 6, 5, 4, 3, 2) +- s8i = windll.s_ret_8i_func(inp) +- self.assertEqual((s8i.a, s8i.b, s8i.c, s8i.d, s8i.e, s8i.f, s8i.g, s8i.h), +- (9*2, 8*3, 7*4, 6*5, 5*6, 4*7, 3*8, 2*9)) ++ @unittest.skipUnless(sys.platform == "win32", 'Windows-specific test') ++ def test_struct_return_8H_stdcall(self): ++ class S8I(Structure): ++ _fields_ = [("a", c_int), ++ ("b", c_int), ++ ("c", c_int), ++ ("d", c_int), ++ ("e", c_int), ++ ("f", c_int), ++ ("g", c_int), ++ ("h", c_int)] ++ windll.s_ret_8i_func.restype = S8I ++ windll.s_ret_8i_func.argtypes = [S8I] ++ inp = S8I(9, 8, 7, 6, 5, 4, 3, 2) ++ s8i = windll.s_ret_8i_func(inp) ++ self.assertEqual( ++ (s8i.a, s8i.b, s8i.c, s8i.d, s8i.e, s8i.f, s8i.g, s8i.h), ++ (9*2, 8*3, 7*4, 6*5, 5*6, 4*7, 3*8, 2*9)) + + def test_sf1651235(self): + # see http://www.python.org/sf/1651235 +diff -r c0e311e010fc Lib/ctypes/test/test_integers.py +--- a/Lib/ctypes/test/test_integers.py ++++ /dev/null +@@ -1,5 +0,0 @@ +-# superseded by test_numbers.py +-import unittest +- +-if __name__ == '__main__': +- unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_keeprefs.py +--- a/Lib/ctypes/test/test_keeprefs.py ++++ b/Lib/ctypes/test/test_keeprefs.py +@@ -94,7 +94,8 @@ + self.assertEqual(x._objects, {'1': i}) + + class DeletePointerTestCase(unittest.TestCase): +- def X_test(self): ++ @unittest.skip('test disabled') ++ def test_X(self): + class X(Structure): + _fields_ = [("p", POINTER(c_char_p))] + x = X() +diff -r c0e311e010fc Lib/ctypes/test/test_loading.py +--- a/Lib/ctypes/test/test_loading.py ++++ b/Lib/ctypes/test/test_loading.py +@@ -21,18 +21,21 @@ + + unknowndll = "xxrandomnamexx" + +- if libc_name is not None: +- def test_load(self): +- CDLL(libc_name) +- CDLL(os.path.basename(libc_name)) +- self.assertRaises(OSError, CDLL, self.unknowndll) ++ @unittest.skipUnless(libc_name is not None, 'could not find libc') ++ def test_load(self): ++ CDLL(libc_name) ++ CDLL(os.path.basename(libc_name)) ++ self.assertRaises(OSError, CDLL, self.unknowndll) + +- if libc_name is not None and os.path.basename(libc_name) == "libc.so.6": +- def test_load_version(self): +- cdll.LoadLibrary("libc.so.6") +- # linux uses version, libc 9 should not exist +- self.assertRaises(OSError, cdll.LoadLibrary, "libc.so.9") +- self.assertRaises(OSError, cdll.LoadLibrary, self.unknowndll) ++ @unittest.skipUnless(libc_name is not None, 'could not find libc') ++ @unittest.skipUnless(libc_name is not None and ++ os.path.basename(libc_name) == "libc.so.6", ++ 'wrong libc path for test') ++ def test_load_version(self): ++ cdll.LoadLibrary("libc.so.6") ++ # linux uses version, libc 9 should not exist ++ self.assertRaises(OSError, cdll.LoadLibrary, "libc.so.9") ++ self.assertRaises(OSError, cdll.LoadLibrary, self.unknowndll) + + def test_find(self): + for name in ("c", "m"): +@@ -41,66 +44,71 @@ + cdll.LoadLibrary(lib) + CDLL(lib) + +- if os.name in ("nt", "ce"): +- def test_load_library(self): +- self.assertIsNotNone(libc_name) +- if is_resource_enabled("printing"): +- print(find_library("kernel32")) +- print(find_library("user32")) ++ @unittest.skipUnless(os.name in ("nt", "ce"), ++ 'test specific to Windows (NT/CE)') ++ def test_load_library(self): ++ self.assertIsNotNone(libc_name) ++ if 
is_resource_enabled("printing"): ++ print(find_library("kernel32")) ++ print(find_library("user32")) + +- if os.name == "nt": +- windll.kernel32.GetModuleHandleW +- windll["kernel32"].GetModuleHandleW +- windll.LoadLibrary("kernel32").GetModuleHandleW +- WinDLL("kernel32").GetModuleHandleW +- elif os.name == "ce": +- windll.coredll.GetModuleHandleW +- windll["coredll"].GetModuleHandleW +- windll.LoadLibrary("coredll").GetModuleHandleW +- WinDLL("coredll").GetModuleHandleW ++ if os.name == "nt": ++ windll.kernel32.GetModuleHandleW ++ windll["kernel32"].GetModuleHandleW ++ windll.LoadLibrary("kernel32").GetModuleHandleW ++ WinDLL("kernel32").GetModuleHandleW ++ elif os.name == "ce": ++ windll.coredll.GetModuleHandleW ++ windll["coredll"].GetModuleHandleW ++ windll.LoadLibrary("coredll").GetModuleHandleW ++ WinDLL("coredll").GetModuleHandleW + +- def test_load_ordinal_functions(self): +- import _ctypes_test +- dll = WinDLL(_ctypes_test.__file__) +- # We load the same function both via ordinal and name +- func_ord = dll[2] +- func_name = dll.GetString +- # addressof gets the address where the function pointer is stored +- a_ord = addressof(func_ord) +- a_name = addressof(func_name) +- f_ord_addr = c_void_p.from_address(a_ord).value +- f_name_addr = c_void_p.from_address(a_name).value +- self.assertEqual(hex(f_ord_addr), hex(f_name_addr)) ++ @unittest.skipUnless(os.name in ("nt", "ce"), ++ 'test specific to Windows (NT/CE)') ++ def test_load_ordinal_functions(self): ++ import _ctypes_test ++ dll = WinDLL(_ctypes_test.__file__) ++ # We load the same function both via ordinal and name ++ func_ord = dll[2] ++ func_name = dll.GetString ++ # addressof gets the address where the function pointer is stored ++ a_ord = addressof(func_ord) ++ a_name = addressof(func_name) ++ f_ord_addr = c_void_p.from_address(a_ord).value ++ f_name_addr = c_void_p.from_address(a_name).value ++ self.assertEqual(hex(f_ord_addr), hex(f_name_addr)) + +- self.assertRaises(AttributeError, dll.__getitem__, 1234) ++ self.assertRaises(AttributeError, dll.__getitem__, 1234) + +- if os.name == "nt": +- def test_1703286_A(self): +- from _ctypes import LoadLibrary, FreeLibrary +- # On winXP 64-bit, advapi32 loads at an address that does +- # NOT fit into a 32-bit integer. FreeLibrary must be able +- # to accept this address. ++ @unittest.skipUnless(os.name == "nt", 'Windows-specific test') ++ def test_1703286_A(self): ++ from _ctypes import LoadLibrary, FreeLibrary ++ # On winXP 64-bit, advapi32 loads at an address that does ++ # NOT fit into a 32-bit integer. FreeLibrary must be able ++ # to accept this address. + +- # These are tests for http://www.python.org/sf/1703286 +- handle = LoadLibrary("advapi32") +- FreeLibrary(handle) ++ # These are tests for http://www.python.org/sf/1703286 ++ handle = LoadLibrary("advapi32") ++ FreeLibrary(handle) + +- def test_1703286_B(self): +- # Since on winXP 64-bit advapi32 loads like described +- # above, the (arbitrarily selected) CloseEventLog function +- # also has a high address. 'call_function' should accept +- # addresses so large. +- from _ctypes import call_function +- advapi32 = windll.advapi32 +- # Calling CloseEventLog with a NULL argument should fail, +- # but the call should not segfault or so. 
+- self.assertEqual(0, advapi32.CloseEventLog(None)) +- windll.kernel32.GetProcAddress.argtypes = c_void_p, c_char_p +- windll.kernel32.GetProcAddress.restype = c_void_p +- proc = windll.kernel32.GetProcAddress(advapi32._handle, b"CloseEventLog") +- self.assertTrue(proc) +- # This is the real test: call the function via 'call_function' +- self.assertEqual(0, call_function(proc, (None,))) ++ @unittest.skipUnless(os.name == "nt", 'Windows-specific test') ++ def test_1703286_B(self): ++ # Since on winXP 64-bit advapi32 loads like described ++ # above, the (arbitrarily selected) CloseEventLog function ++ # also has a high address. 'call_function' should accept ++ # addresses so large. ++ from _ctypes import call_function ++ advapi32 = windll.advapi32 ++ # Calling CloseEventLog with a NULL argument should fail, ++ # but the call should not segfault or so. ++ self.assertEqual(0, advapi32.CloseEventLog(None)) ++ windll.kernel32.GetProcAddress.argtypes = c_void_p, c_char_p ++ windll.kernel32.GetProcAddress.restype = c_void_p ++ proc = windll.kernel32.GetProcAddress(advapi32._handle, ++ b"CloseEventLog") ++ self.assertTrue(proc) ++ # This is the real test: call the function via 'call_function' ++ self.assertEqual(0, call_function(proc, (None,))) + + if __name__ == "__main__": + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_macholib.py +--- a/Lib/ctypes/test/test_macholib.py ++++ b/Lib/ctypes/test/test_macholib.py +@@ -43,21 +43,21 @@ + raise ValueError("%s not found" % (name,)) + + class MachOTest(unittest.TestCase): +- if sys.platform == "darwin": +- def test_find(self): ++ @unittest.skipUnless(sys.platform == "darwin", 'OSX-specific test') ++ def test_find(self): + +- self.assertEqual(find_lib('pthread'), +- '/usr/lib/libSystem.B.dylib') ++ self.assertEqual(find_lib('pthread'), ++ '/usr/lib/libSystem.B.dylib') + +- result = find_lib('z') +- # Issue #21093: dyld default search path includes $HOME/lib and +- # /usr/local/lib before /usr/lib, which caused test failures if +- # a local copy of libz exists in one of them. Now ignore the head +- # of the path. +- self.assertRegex(result, r".*/lib/libz\..*.*\.dylib") ++ result = find_lib('z') ++ # Issue #21093: dyld default search path includes $HOME/lib and ++ # /usr/local/lib before /usr/lib, which caused test failures if ++ # a local copy of libz exists in one of them. Now ignore the head ++ # of the path. ++ self.assertRegex(result, r".*/lib/libz\..*.*\.dylib") + +- self.assertEqual(find_lib('IOKit'), +- '/System/Library/Frameworks/IOKit.framework/Versions/A/IOKit') ++ self.assertEqual(find_lib('IOKit'), ++ '/System/Library/Frameworks/IOKit.framework/Versions/A/IOKit') + + if __name__ == "__main__": + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_memfunctions.py +--- a/Lib/ctypes/test/test_memfunctions.py ++++ b/Lib/ctypes/test/test_memfunctions.py +@@ -2,17 +2,19 @@ + from test import support + import unittest + from ctypes import * ++from ctypes.test import need_symbol + + class MemFunctionsTest(unittest.TestCase): +-## def test_overflow(self): +-## # string_at and wstring_at must use the Python calling +-## # convention (which acquires the GIL and checks the Python +-## # error flag). 
Provoke an error and catch it; see also issue +-## # #3554: +-## self.assertRaises((OverflowError, MemoryError, SystemError), +-## lambda: wstring_at(u"foo", sys.maxint - 1)) +-## self.assertRaises((OverflowError, MemoryError, SystemError), +-## lambda: string_at("foo", sys.maxint - 1)) ++ @unittest.skip('test disabled') ++ def test_overflow(self): ++ # string_at and wstring_at must use the Python calling ++ # convention (which acquires the GIL and checks the Python ++ # error flag). Provoke an error and catch it; see also issue ++ # #3554: ++ self.assertRaises((OverflowError, MemoryError, SystemError), ++ lambda: wstring_at(u"foo", sys.maxint - 1)) ++ self.assertRaises((OverflowError, MemoryError, SystemError), ++ lambda: string_at("foo", sys.maxint - 1)) + + def test_memmove(self): + # large buffers apparently increase the chance that the memory +@@ -61,21 +63,17 @@ + self.assertEqual(string_at(b"foo bar", 7), b"foo bar") + self.assertEqual(string_at(b"foo bar", 3), b"foo") + +- try: +- create_unicode_buffer +- except NameError: +- pass +- else: +- def test_wstring_at(self): +- p = create_unicode_buffer("Hello, World") +- a = create_unicode_buffer(1000000) +- result = memmove(a, p, len(p) * sizeof(c_wchar)) +- self.assertEqual(a.value, "Hello, World") ++ @need_symbol('create_unicode_buffer') ++ def test_wstring_at(self): ++ p = create_unicode_buffer("Hello, World") ++ a = create_unicode_buffer(1000000) ++ result = memmove(a, p, len(p) * sizeof(c_wchar)) ++ self.assertEqual(a.value, "Hello, World") + +- self.assertEqual(wstring_at(a), "Hello, World") +- self.assertEqual(wstring_at(a, 5), "Hello") +- self.assertEqual(wstring_at(a, 16), "Hello, World\0\0\0\0") +- self.assertEqual(wstring_at(a, 0), "") ++ self.assertEqual(wstring_at(a), "Hello, World") ++ self.assertEqual(wstring_at(a, 5), "Hello") ++ self.assertEqual(wstring_at(a, 16), "Hello, World\0\0\0\0") ++ self.assertEqual(wstring_at(a, 0), "") + + if __name__ == "__main__": + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_numbers.py +--- a/Lib/ctypes/test/test_numbers.py ++++ b/Lib/ctypes/test/test_numbers.py +@@ -82,12 +82,13 @@ + self.assertRaises(TypeError, t, "") + self.assertRaises(TypeError, t, None) + +-## def test_valid_ranges(self): +-## # invalid values of the correct type +-## # raise ValueError (not OverflowError) +-## for t, (l, h) in zip(unsigned_types, unsigned_ranges): +-## self.assertRaises(ValueError, t, l-1) +-## self.assertRaises(ValueError, t, h+1) ++ @unittest.skip('test disabled') ++ def test_valid_ranges(self): ++ # invalid values of the correct type ++ # raise ValueError (not OverflowError) ++ for t, (l, h) in zip(unsigned_types, unsigned_ranges): ++ self.assertRaises(ValueError, t, l-1) ++ self.assertRaises(ValueError, t, h+1) + + def test_from_param(self): + # the from_param class method attribute always +@@ -200,16 +201,17 @@ + self.assertEqual(v.value, b'?') + + # array does not support c_bool / 't' +- # def test_bool_from_address(self): +- # from ctypes import c_bool +- # from array import array +- # a = array(c_bool._type_, [True]) +- # v = t.from_address(a.buffer_info()[0]) +- # self.assertEqual(v.value, a[0]) +- # self.assertEqual(type(v) is t) +- # a[0] = False +- # self.assertEqual(v.value, a[0]) +- # self.assertEqual(type(v) is t) ++ @unittest.skip('test disabled') ++ def test_bool_from_address(self): ++ from ctypes import c_bool ++ from array import array ++ a = array(c_bool._type_, [True]) ++ v = t.from_address(a.buffer_info()[0]) ++ self.assertEqual(v.value, a[0]) ++ 
self.assertEqual(type(v) is t) ++ a[0] = False ++ self.assertEqual(v.value, a[0]) ++ self.assertEqual(type(v) is t) + + def test_init(self): + # c_int() can be initialized from Python's int, and c_int. +@@ -227,8 +229,9 @@ + if (hasattr(t, "__ctype_le__")): + self.assertRaises(OverflowError, t.__ctype_le__, big_int) + +-## def test_perf(self): +-## check_perf() ++ @unittest.skip('test disabled') ++ def test_perf(self): ++ check_perf() + + from ctypes import _SimpleCData + class c_int_S(_SimpleCData): +diff -r c0e311e010fc Lib/ctypes/test/test_objects.py +--- a/Lib/ctypes/test/test_objects.py ++++ b/Lib/ctypes/test/test_objects.py +@@ -59,12 +59,9 @@ + import ctypes.test.test_objects + + class TestCase(unittest.TestCase): +- if sys.hexversion > 0x02040000: +- # Python 2.3 has no ELLIPSIS flag, so we don't test with this +- # version: +- def test(self): +- doctest.testmod(ctypes.test.test_objects) ++ def test(self): ++ failures, tests = doctest.testmod(ctypes.test.test_objects) ++ self.assertFalse(failures, 'doctests failed, see output above') + + if __name__ == '__main__': +- if sys.hexversion > 0x02040000: +- doctest.testmod(ctypes.test.test_objects) ++ doctest.testmod(ctypes.test.test_objects) +diff -r c0e311e010fc Lib/ctypes/test/test_parameters.py +--- a/Lib/ctypes/test/test_parameters.py ++++ b/Lib/ctypes/test/test_parameters.py +@@ -1,4 +1,5 @@ + import unittest, sys ++from ctypes.test import need_symbol + + class SimpleTypesTestCase(unittest.TestCase): + +@@ -35,10 +36,9 @@ + self.assertEqual(CVOIDP.from_param("abc"), "abcabc") + self.assertEqual(CCHARP.from_param("abc"), "abcabcabcabc") + +- try: +- from ctypes import c_wchar_p +- except ImportError: +- return ++ @need_symbol('c_wchar_p') ++ def test_subclasses_c_wchar_p(self): ++ from ctypes import c_wchar_p + + class CWCHARP(c_wchar_p): + def from_param(cls, value): +@@ -66,13 +66,9 @@ + a = c_char_p(b"123") + self.assertIs(c_char_p.from_param(a), a) + ++ @need_symbol('c_wchar_p') + def test_cw_strings(self): +- from ctypes import byref +- try: +- from ctypes import c_wchar_p +- except ImportError: +-## print "(No c_wchar_p)" +- return ++ from ctypes import byref, c_wchar_p + + c_wchar_p.from_param("123") + +@@ -139,9 +135,6 @@ + self.assertRaises(TypeError, LPINT.from_param, c_long*3) + self.assertRaises(TypeError, LPINT.from_param, c_uint*3) + +-## def test_performance(self): +-## check_perf() +- + def test_noctypes_argtype(self): + import _ctypes_test + from ctypes import CDLL, c_void_p, ArgumentError +diff -r c0e311e010fc Lib/ctypes/test/test_prototypes.py +--- a/Lib/ctypes/test/test_prototypes.py ++++ b/Lib/ctypes/test/test_prototypes.py +@@ -1,4 +1,5 @@ + from ctypes import * ++from ctypes.test import need_symbol + import unittest + + # IMPORTANT INFO: +@@ -135,13 +136,14 @@ + func(pointer(c_int())) + func((c_int * 3)()) + +- try: +- func.restype = c_wchar_p +- except NameError: +- pass +- else: +- self.assertEqual(None, func(c_wchar_p(None))) +- self.assertEqual("123", func(c_wchar_p("123"))) ++ @need_symbol('c_wchar_p') ++ def test_c_void_p_arg_with_c_wchar_p(self): ++ func = testdll._testfunc_p_p ++ func.restype = c_wchar_p ++ func.argtypes = c_void_p, ++ ++ self.assertEqual(None, func(c_wchar_p(None))) ++ self.assertEqual("123", func(c_wchar_p("123"))) + + def test_instance(self): + func = testdll._testfunc_p_p +@@ -156,51 +158,47 @@ + func.argtypes = None + self.assertEqual(None, func(X())) + +-try: +- c_wchar +-except NameError: +- pass +-else: +- class WCharPointersTestCase(unittest.TestCase): 
++@need_symbol('c_wchar') ++class WCharPointersTestCase(unittest.TestCase): + +- def setUp(self): +- func = testdll._testfunc_p_p +- func.restype = c_int +- func.argtypes = None ++ def setUp(self): ++ func = testdll._testfunc_p_p ++ func.restype = c_int ++ func.argtypes = None + + +- def test_POINTER_c_wchar_arg(self): +- func = testdll._testfunc_p_p +- func.restype = c_wchar_p +- func.argtypes = POINTER(c_wchar), ++ def test_POINTER_c_wchar_arg(self): ++ func = testdll._testfunc_p_p ++ func.restype = c_wchar_p ++ func.argtypes = POINTER(c_wchar), + +- self.assertEqual(None, func(None)) +- self.assertEqual("123", func("123")) +- self.assertEqual(None, func(c_wchar_p(None))) +- self.assertEqual("123", func(c_wchar_p("123"))) ++ self.assertEqual(None, func(None)) ++ self.assertEqual("123", func("123")) ++ self.assertEqual(None, func(c_wchar_p(None))) ++ self.assertEqual("123", func(c_wchar_p("123"))) + +- self.assertEqual("123", func(c_wbuffer("123"))) +- ca = c_wchar("a") +- self.assertEqual("a", func(pointer(ca))[0]) +- self.assertEqual("a", func(byref(ca))[0]) ++ self.assertEqual("123", func(c_wbuffer("123"))) ++ ca = c_wchar("a") ++ self.assertEqual("a", func(pointer(ca))[0]) ++ self.assertEqual("a", func(byref(ca))[0]) + +- def test_c_wchar_p_arg(self): +- func = testdll._testfunc_p_p +- func.restype = c_wchar_p +- func.argtypes = c_wchar_p, ++ def test_c_wchar_p_arg(self): ++ func = testdll._testfunc_p_p ++ func.restype = c_wchar_p ++ func.argtypes = c_wchar_p, + +- c_wchar_p.from_param("123") ++ c_wchar_p.from_param("123") + +- self.assertEqual(None, func(None)) +- self.assertEqual("123", func("123")) +- self.assertEqual(None, func(c_wchar_p(None))) +- self.assertEqual("123", func(c_wchar_p("123"))) ++ self.assertEqual(None, func(None)) ++ self.assertEqual("123", func("123")) ++ self.assertEqual(None, func(c_wchar_p(None))) ++ self.assertEqual("123", func(c_wchar_p("123"))) + +- # XXX Currently, these raise TypeErrors, although they shouldn't: +- self.assertEqual("123", func(c_wbuffer("123"))) +- ca = c_wchar("a") +- self.assertEqual("a", func(pointer(ca))[0]) +- self.assertEqual("a", func(byref(ca))[0]) ++ # XXX Currently, these raise TypeErrors, although they shouldn't: ++ self.assertEqual("123", func(c_wbuffer("123"))) ++ ca = c_wchar("a") ++ self.assertEqual("a", func(pointer(ca))[0]) ++ self.assertEqual("a", func(byref(ca))[0]) + + class ArrayTest(unittest.TestCase): + def test(self): +diff -r c0e311e010fc Lib/ctypes/test/test_python_api.py +--- a/Lib/ctypes/test/test_python_api.py ++++ b/Lib/ctypes/test/test_python_api.py +@@ -1,7 +1,7 @@ + from ctypes import * + import unittest, sys + from test import support +-from ctypes.test import is_resource_enabled ++from ctypes.test import requires + + ################################################################ + # This section should be moved into ctypes\__init__.py, when it's ready. +@@ -39,24 +39,25 @@ + del pyob + self.assertEqual(grc(s), refcnt) + +- if is_resource_enabled("refcount"): +- # This test is unreliable, because it is possible that code in +- # unittest changes the refcount of the '42' integer. So, it +- # is disabled by default. +- def test_PyLong_Long(self): +- ref42 = grc(42) +- pythonapi.PyLong_FromLong.restype = py_object +- self.assertEqual(pythonapi.PyLong_FromLong(42), 42) ++ # This test is unreliable, because it is possible that code in ++ # unittest changes the refcount of the '42' integer. So, it ++ # is disabled by default. 
++ @support.refcount_test ++ def test_PyLong_Long(self): ++ requires("refcount") ++ ref42 = grc(42) ++ pythonapi.PyLong_FromLong.restype = py_object ++ self.assertEqual(pythonapi.PyLong_FromLong(42), 42) + +- self.assertEqual(grc(42), ref42) ++ self.assertEqual(grc(42), ref42) + +- pythonapi.PyLong_AsLong.argtypes = (py_object,) +- pythonapi.PyLong_AsLong.restype = c_long ++ pythonapi.PyLong_AsLong.argtypes = (py_object,) ++ pythonapi.PyLong_AsLong.restype = c_long + +- res = pythonapi.PyLong_AsLong(42) +- self.assertEqual(grc(res), ref42 + 1) +- del res +- self.assertEqual(grc(42), ref42) ++ res = pythonapi.PyLong_AsLong(42) ++ self.assertEqual(grc(res), ref42 + 1) ++ del res ++ self.assertEqual(grc(42), ref42) + + @support.refcount_test + def test_PyObj_FromPtr(self): +diff -r c0e311e010fc Lib/ctypes/test/test_random_things.py +--- a/Lib/ctypes/test/test_random_things.py ++++ b/Lib/ctypes/test/test_random_things.py +@@ -5,23 +5,22 @@ + 42 / arg + raise ValueError(arg) + +-if sys.platform == "win32": ++@unittest.skipUnless(sys.platform == "win32", 'Windows-specific test') ++class call_function_TestCase(unittest.TestCase): ++ # _ctypes.call_function is deprecated and private, but used by ++ # Gary Bishp's readline module. If we have it, we must test it as well. + +- class call_function_TestCase(unittest.TestCase): +- # _ctypes.call_function is deprecated and private, but used by +- # Gary Bishp's readline module. If we have it, we must test it as well. ++ def test(self): ++ from _ctypes import call_function ++ windll.kernel32.LoadLibraryA.restype = c_void_p ++ windll.kernel32.GetProcAddress.argtypes = c_void_p, c_char_p ++ windll.kernel32.GetProcAddress.restype = c_void_p + +- def test(self): +- from _ctypes import call_function +- windll.kernel32.LoadLibraryA.restype = c_void_p +- windll.kernel32.GetProcAddress.argtypes = c_void_p, c_char_p +- windll.kernel32.GetProcAddress.restype = c_void_p ++ hdll = windll.kernel32.LoadLibraryA(b"kernel32") ++ funcaddr = windll.kernel32.GetProcAddress(hdll, b"GetModuleHandleA") + +- hdll = windll.kernel32.LoadLibraryA(b"kernel32") +- funcaddr = windll.kernel32.GetProcAddress(hdll, b"GetModuleHandleA") +- +- self.assertEqual(call_function(funcaddr, (None,)), +- windll.kernel32.GetModuleHandleA(None)) ++ self.assertEqual(call_function(funcaddr, (None,)), ++ windll.kernel32.GetModuleHandleA(None)) + + class CallbackTracbackTestCase(unittest.TestCase): + # When an exception is raised in a ctypes callback function, the C +diff -r c0e311e010fc Lib/ctypes/test/test_slicing.py +--- a/Lib/ctypes/test/test_slicing.py ++++ b/Lib/ctypes/test/test_slicing.py +@@ -1,5 +1,6 @@ + import unittest + from ctypes import * ++from ctypes.test import need_symbol + + import _ctypes_test + +@@ -125,44 +126,40 @@ + self.assertEqual(p[2:5:-3], s[2:5:-3]) + + +- try: +- c_wchar +- except NameError: +- pass +- else: +- def test_wchar_ptr(self): +- s = "abcdefghijklmnopqrstuvwxyz\0" ++ @need_symbol('c_wchar') ++ def test_wchar_ptr(self): ++ s = "abcdefghijklmnopqrstuvwxyz\0" + +- dll = CDLL(_ctypes_test.__file__) +- dll.my_wcsdup.restype = POINTER(c_wchar) +- dll.my_wcsdup.argtypes = POINTER(c_wchar), +- dll.my_free.restype = None +- res = dll.my_wcsdup(s) +- self.assertEqual(res[:len(s)], s) +- self.assertEqual(res[:len(s):], s) +- self.assertEqual(res[len(s)-1:-1:-1], s[::-1]) +- self.assertEqual(res[len(s)-1:5:-7], s[:5:-7]) ++ dll = CDLL(_ctypes_test.__file__) ++ dll.my_wcsdup.restype = POINTER(c_wchar) ++ dll.my_wcsdup.argtypes = POINTER(c_wchar), ++ dll.my_free.restype = None 
++ res = dll.my_wcsdup(s) ++ self.assertEqual(res[:len(s)], s) ++ self.assertEqual(res[:len(s):], s) ++ self.assertEqual(res[len(s)-1:-1:-1], s[::-1]) ++ self.assertEqual(res[len(s)-1:5:-7], s[:5:-7]) + +- import operator +- self.assertRaises(TypeError, operator.setitem, +- res, slice(0, 5), "abcde") +- dll.my_free(res) ++ import operator ++ self.assertRaises(TypeError, operator.setitem, ++ res, slice(0, 5), "abcde") ++ dll.my_free(res) + +- if sizeof(c_wchar) == sizeof(c_short): +- dll.my_wcsdup.restype = POINTER(c_short) +- elif sizeof(c_wchar) == sizeof(c_int): +- dll.my_wcsdup.restype = POINTER(c_int) +- elif sizeof(c_wchar) == sizeof(c_long): +- dll.my_wcsdup.restype = POINTER(c_long) +- else: +- return +- res = dll.my_wcsdup(s) +- tmpl = list(range(ord("a"), ord("z")+1)) +- self.assertEqual(res[:len(s)-1], tmpl) +- self.assertEqual(res[:len(s)-1:], tmpl) +- self.assertEqual(res[len(s)-2:-1:-1], tmpl[::-1]) +- self.assertEqual(res[len(s)-2:5:-7], tmpl[:5:-7]) +- dll.my_free(res) ++ if sizeof(c_wchar) == sizeof(c_short): ++ dll.my_wcsdup.restype = POINTER(c_short) ++ elif sizeof(c_wchar) == sizeof(c_int): ++ dll.my_wcsdup.restype = POINTER(c_int) ++ elif sizeof(c_wchar) == sizeof(c_long): ++ dll.my_wcsdup.restype = POINTER(c_long) ++ else: ++ self.skipTest('Pointers to c_wchar are not supported') ++ res = dll.my_wcsdup(s) ++ tmpl = list(range(ord("a"), ord("z")+1)) ++ self.assertEqual(res[:len(s)-1], tmpl) ++ self.assertEqual(res[:len(s)-1:], tmpl) ++ self.assertEqual(res[len(s)-2:-1:-1], tmpl[::-1]) ++ self.assertEqual(res[len(s)-2:5:-7], tmpl[:5:-7]) ++ dll.my_free(res) + + ################################################################ + +diff -r c0e311e010fc Lib/ctypes/test/test_strings.py +--- a/Lib/ctypes/test/test_strings.py ++++ b/Lib/ctypes/test/test_strings.py +@@ -1,5 +1,6 @@ + import unittest + from ctypes import * ++from ctypes.test import need_symbol + + class StringArrayTestCase(unittest.TestCase): + def test(self): +@@ -53,36 +54,33 @@ + ## print BUF.from_param(c_char_p("python")) + ## print BUF.from_param(BUF(*"pyth")) + +-try: +- c_wchar +-except NameError: +- pass +-else: +- class WStringArrayTestCase(unittest.TestCase): +- def test(self): +- BUF = c_wchar * 4 ++@need_symbol('c_wchar') ++class WStringArrayTestCase(unittest.TestCase): ++ def test(self): ++ BUF = c_wchar * 4 + +- buf = BUF("a", "b", "c") +- self.assertEqual(buf.value, "abc") ++ buf = BUF("a", "b", "c") ++ self.assertEqual(buf.value, "abc") + +- buf.value = "ABCD" +- self.assertEqual(buf.value, "ABCD") ++ buf.value = "ABCD" ++ self.assertEqual(buf.value, "ABCD") + +- buf.value = "x" +- self.assertEqual(buf.value, "x") ++ buf.value = "x" ++ self.assertEqual(buf.value, "x") + +- buf[1] = "Z" +- self.assertEqual(buf.value, "xZCD") ++ buf[1] = "Z" ++ self.assertEqual(buf.value, "xZCD") + +- @unittest.skipIf(sizeof(c_wchar) < 4, +- "sizeof(wchar_t) is smaller than 4 bytes") +- def test_nonbmp(self): +- u = chr(0x10ffff) +- w = c_wchar(u) +- self.assertEqual(w.value, u) ++ @unittest.skipIf(sizeof(c_wchar) < 4, ++ "sizeof(wchar_t) is smaller than 4 bytes") ++ def test_nonbmp(self): ++ u = chr(0x10ffff) ++ w = c_wchar(u) ++ self.assertEqual(w.value, u) + + class StringTestCase(unittest.TestCase): +- def XX_test_basic_strings(self): ++ @unittest.skip('test disabled') ++ def test_basic_strings(self): + cs = c_string("abcdef") + + # Cannot call len on a c_string any longer +@@ -108,7 +106,8 @@ + + self.assertRaises(TypeError, c_string, "123") + +- def XX_test_sized_strings(self): ++ @unittest.skip('test 
disabled') ++ def test_sized_strings(self): + + # New in releases later than 0.4.0: + self.assertRaises(TypeError, c_string, None) +@@ -125,7 +124,8 @@ + self.assertEqual(c_string(2).raw[-1], "\000") + self.assertEqual(len(c_string(2).raw), 2) + +- def XX_test_initialized_strings(self): ++ @unittest.skip('test disabled') ++ def test_initialized_strings(self): + + self.assertEqual(c_string("ab", 4).raw[:2], "ab") + self.assertEqual(c_string("ab", 4).raw[:2:], "ab") +@@ -134,7 +134,8 @@ + self.assertEqual(c_string("ab", 4).raw[-1], "\000") + self.assertEqual(c_string("ab", 2).raw, "a\000") + +- def XX_test_toolong(self): ++ @unittest.skip('test disabled') ++ def test_toolong(self): + cs = c_string("abcdef") + # Much too long string: + self.assertRaises(ValueError, setattr, cs, "value", "123456789012345") +@@ -142,54 +143,53 @@ + # One char too long values: + self.assertRaises(ValueError, setattr, cs, "value", "1234567") + +-## def test_perf(self): +-## check_perf() ++ @unittest.skip('test disabled') ++ def test_perf(self): ++ check_perf() + +-try: +- c_wchar +-except NameError: +- pass +-else: +- class WStringTestCase(unittest.TestCase): +- def test_wchar(self): +- c_wchar("x") +- repr(byref(c_wchar("x"))) +- c_wchar("x") ++@need_symbol('c_wchar') ++class WStringTestCase(unittest.TestCase): ++ def test_wchar(self): ++ c_wchar("x") ++ repr(byref(c_wchar("x"))) ++ c_wchar("x") + + +- def X_test_basic_wstrings(self): +- cs = c_wstring("abcdef") ++ @unittest.skip('test disabled') ++ def test_basic_wstrings(self): ++ cs = c_wstring("abcdef") + +- # XXX This behaviour is about to change: +- # len returns the size of the internal buffer in bytes. +- # This includes the terminating NUL character. +- self.assertEqual(sizeof(cs), 14) ++ # XXX This behaviour is about to change: ++ # len returns the size of the internal buffer in bytes. ++ # This includes the terminating NUL character. ++ self.assertEqual(sizeof(cs), 14) + +- # The value property is the string up to the first terminating NUL. +- self.assertEqual(cs.value, "abcdef") +- self.assertEqual(c_wstring("abc\000def").value, "abc") ++ # The value property is the string up to the first terminating NUL. 
++ self.assertEqual(cs.value, "abcdef") ++ self.assertEqual(c_wstring("abc\000def").value, "abc") + +- self.assertEqual(c_wstring("abc\000def").value, "abc") ++ self.assertEqual(c_wstring("abc\000def").value, "abc") + +- # The raw property is the total buffer contents: +- self.assertEqual(cs.raw, "abcdef\000") +- self.assertEqual(c_wstring("abc\000def").raw, "abc\000def\000") ++ # The raw property is the total buffer contents: ++ self.assertEqual(cs.raw, "abcdef\000") ++ self.assertEqual(c_wstring("abc\000def").raw, "abc\000def\000") + +- # We can change the value: +- cs.value = "ab" +- self.assertEqual(cs.value, "ab") +- self.assertEqual(cs.raw, "ab\000\000\000\000\000") ++ # We can change the value: ++ cs.value = "ab" ++ self.assertEqual(cs.value, "ab") ++ self.assertEqual(cs.raw, "ab\000\000\000\000\000") + +- self.assertRaises(TypeError, c_wstring, "123") +- self.assertRaises(ValueError, c_wstring, 0) ++ self.assertRaises(TypeError, c_wstring, "123") ++ self.assertRaises(ValueError, c_wstring, 0) + +- def X_test_toolong(self): +- cs = c_wstring("abcdef") +- # Much too long string: +- self.assertRaises(ValueError, setattr, cs, "value", "123456789012345") ++ @unittest.skip('test disabled') ++ def test_toolong(self): ++ cs = c_wstring("abcdef") ++ # Much too long string: ++ self.assertRaises(ValueError, setattr, cs, "value", "123456789012345") + +- # One char too long values: +- self.assertRaises(ValueError, setattr, cs, "value", "1234567") ++ # One char too long values: ++ self.assertRaises(ValueError, setattr, cs, "value", "1234567") + + + def run_test(rep, msg, func, arg): +diff -r c0e311e010fc Lib/ctypes/test/test_structures.py +--- a/Lib/ctypes/test/test_structures.py ++++ b/Lib/ctypes/test/test_structures.py +@@ -1,5 +1,6 @@ + import unittest + from ctypes import * ++from ctypes.test import need_symbol + from struct import calcsize + import _testcapi + +@@ -291,12 +292,8 @@ + self.assertEqual(p.phone.number, b"5678") + self.assertEqual(p.age, 5) + ++ @need_symbol('c_wchar') + def test_structures_with_wchar(self): +- try: +- c_wchar +- except NameError: +- return # no unicode +- + class PersonW(Structure): + _fields_ = [("name", c_wchar * 12), + ("age", c_int)] +@@ -354,14 +351,14 @@ + except Exception as detail: + return detail.__class__, str(detail) + +- +-## def test_subclass_creation(self): +-## meta = type(Structure) +-## # same as 'class X(Structure): pass' +-## # fails, since we need either a _fields_ or a _abstract_ attribute +-## cls, msg = self.get_except(meta, "X", (Structure,), {}) +-## self.assertEqual((cls, msg), +-## (AttributeError, "class must define a '_fields_' attribute")) ++ @unittest.skip('test disabled') ++ def test_subclass_creation(self): ++ meta = type(Structure) ++ # same as 'class X(Structure): pass' ++ # fails, since we need either a _fields_ or a _abstract_ attribute ++ cls, msg = self.get_except(meta, "X", (Structure,), {}) ++ self.assertEqual((cls, msg), ++ (AttributeError, "class must define a '_fields_' attribute")) + + def test_abstract_class(self): + class X(Structure): +diff -r c0e311e010fc Lib/ctypes/test/test_unicode.py +--- a/Lib/ctypes/test/test_unicode.py ++++ b/Lib/ctypes/test/test_unicode.py +@@ -1,58 +1,55 @@ + import unittest + import ctypes ++from ctypes.test import need_symbol + +-try: +- ctypes.c_wchar +-except AttributeError: +- pass +-else: +- import _ctypes_test ++import _ctypes_test + +- class UnicodeTestCase(unittest.TestCase): +- def test_wcslen(self): +- dll = ctypes.CDLL(_ctypes_test.__file__) +- wcslen = dll.my_wcslen +- 
wcslen.argtypes = [ctypes.c_wchar_p] ++@need_symbol('c_wchar') ++class UnicodeTestCase(unittest.TestCase): ++ def test_wcslen(self): ++ dll = ctypes.CDLL(_ctypes_test.__file__) ++ wcslen = dll.my_wcslen ++ wcslen.argtypes = [ctypes.c_wchar_p] + +- self.assertEqual(wcslen("abc"), 3) +- self.assertEqual(wcslen("ab\u2070"), 3) +- self.assertRaises(ctypes.ArgumentError, wcslen, b"ab\xe4") ++ self.assertEqual(wcslen("abc"), 3) ++ self.assertEqual(wcslen("ab\u2070"), 3) ++ self.assertRaises(ctypes.ArgumentError, wcslen, b"ab\xe4") + +- def test_buffers(self): +- buf = ctypes.create_unicode_buffer("abc") +- self.assertEqual(len(buf), 3+1) ++ def test_buffers(self): ++ buf = ctypes.create_unicode_buffer("abc") ++ self.assertEqual(len(buf), 3+1) + +- buf = ctypes.create_unicode_buffer("ab\xe4\xf6\xfc") +- self.assertEqual(buf[:], "ab\xe4\xf6\xfc\0") +- self.assertEqual(buf[::], "ab\xe4\xf6\xfc\0") +- self.assertEqual(buf[::-1], '\x00\xfc\xf6\xe4ba') +- self.assertEqual(buf[::2], 'a\xe4\xfc') +- self.assertEqual(buf[6:5:-1], "") ++ buf = ctypes.create_unicode_buffer("ab\xe4\xf6\xfc") ++ self.assertEqual(buf[:], "ab\xe4\xf6\xfc\0") ++ self.assertEqual(buf[::], "ab\xe4\xf6\xfc\0") ++ self.assertEqual(buf[::-1], '\x00\xfc\xf6\xe4ba') ++ self.assertEqual(buf[::2], 'a\xe4\xfc') ++ self.assertEqual(buf[6:5:-1], "") + +- func = ctypes.CDLL(_ctypes_test.__file__)._testfunc_p_p ++func = ctypes.CDLL(_ctypes_test.__file__)._testfunc_p_p + +- class StringTestCase(UnicodeTestCase): +- def setUp(self): +- func.argtypes = [ctypes.c_char_p] +- func.restype = ctypes.c_char_p ++class StringTestCase(UnicodeTestCase): ++ def setUp(self): ++ func.argtypes = [ctypes.c_char_p] ++ func.restype = ctypes.c_char_p + +- def tearDown(self): +- func.argtypes = None +- func.restype = ctypes.c_int ++ def tearDown(self): ++ func.argtypes = None ++ func.restype = ctypes.c_int + +- def test_func(self): +- self.assertEqual(func(b"abc\xe4"), b"abc\xe4") ++ def test_func(self): ++ self.assertEqual(func(b"abc\xe4"), b"abc\xe4") + +- def test_buffers(self): +- buf = ctypes.create_string_buffer(b"abc") +- self.assertEqual(len(buf), 3+1) ++ def test_buffers(self): ++ buf = ctypes.create_string_buffer(b"abc") ++ self.assertEqual(len(buf), 3+1) + +- buf = ctypes.create_string_buffer(b"ab\xe4\xf6\xfc") +- self.assertEqual(buf[:], b"ab\xe4\xf6\xfc\0") +- self.assertEqual(buf[::], b"ab\xe4\xf6\xfc\0") +- self.assertEqual(buf[::-1], b'\x00\xfc\xf6\xe4ba') +- self.assertEqual(buf[::2], b'a\xe4\xfc') +- self.assertEqual(buf[6:5:-1], b"") ++ buf = ctypes.create_string_buffer(b"ab\xe4\xf6\xfc") ++ self.assertEqual(buf[:], b"ab\xe4\xf6\xfc\0") ++ self.assertEqual(buf[::], b"ab\xe4\xf6\xfc\0") ++ self.assertEqual(buf[::-1], b'\x00\xfc\xf6\xe4ba') ++ self.assertEqual(buf[::2], b'a\xe4\xfc') ++ self.assertEqual(buf[6:5:-1], b"") + + + if __name__ == '__main__': +diff -r c0e311e010fc Lib/ctypes/test/test_values.py +--- a/Lib/ctypes/test/test_values.py ++++ b/Lib/ctypes/test/test_values.py +@@ -3,6 +3,7 @@ + """ + + import unittest ++import sys + from ctypes import * + + import _ctypes_test +@@ -27,62 +28,68 @@ + ctdll = CDLL(_ctypes_test.__file__) + self.assertRaises(ValueError, c_int.in_dll, ctdll, "Undefined_Symbol") + +- class Win_ValuesTestCase(unittest.TestCase): +- """This test only works when python itself is a dll/shared library""" ++@unittest.skipUnless(sys.platform == 'win32', 'Windows-specific test') ++class Win_ValuesTestCase(unittest.TestCase): ++ """This test only works when python itself is a dll/shared library""" + +- def 
test_optimizeflag(self): +- # This test accesses the Py_OptimizeFlag intger, which is +- # exported by the Python dll. ++ def test_optimizeflag(self): ++ # This test accesses the Py_OptimizeFlag integer, which is ++ # exported by the Python dll and should match the sys.flags value + +- # It's value is set depending on the -O and -OO flags: +- # if not given, it is 0 and __debug__ is 1. +- # If -O is given, the flag is 1, for -OO it is 2. +- # docstrings are also removed in the latter case. +- opt = c_int.in_dll(pydll, "Py_OptimizeFlag").value +- if __debug__: +- self.assertEqual(opt, 0) +- elif ValuesTestCase.__doc__ is not None: +- self.assertEqual(opt, 1) +- else: +- self.assertEqual(opt, 2) ++ opt = c_int.in_dll(pythonapi, "Py_OptimizeFlag").value ++ self.assertEqual(opt, sys.flags.optimize) + +- def test_frozentable(self): +- # Python exports a PyImport_FrozenModules symbol. This is a +- # pointer to an array of struct _frozen entries. The end of the +- # array is marked by an entry containing a NULL name and zero +- # size. ++ def test_frozentable(self): ++ # Python exports a PyImport_FrozenModules symbol. This is a ++ # pointer to an array of struct _frozen entries. The end of the ++ # array is marked by an entry containing a NULL name and zero ++ # size. + +- # In standard Python, this table contains a __hello__ +- # module, and a __phello__ package containing a spam +- # module. +- class struct_frozen(Structure): +- _fields_ = [("name", c_char_p), +- ("code", POINTER(c_ubyte)), +- ("size", c_int)] +- FrozenTable = POINTER(struct_frozen) ++ # In standard Python, this table contains a __hello__ ++ # module, and a __phello__ package containing a spam ++ # module. ++ class struct_frozen(Structure): ++ _fields_ = [("name", c_char_p), ++ ("code", POINTER(c_ubyte)), ++ ("size", c_int)] ++ FrozenTable = POINTER(struct_frozen) + +- ft = FrozenTable.in_dll(pydll, "PyImport_FrozenModules") +- # ft is a pointer to the struct_frozen entries: +- items = [] +- for entry in ft: +- # This is dangerous. We *can* iterate over a pointer, but +- # the loop will not terminate (maybe with an access +- # violation;-) because the pointer instance has no size. +- if entry.name is None: +- break +- items.append((entry.name, entry.size)) +- import sys +- if sys.version_info[:2] >= (2, 3): +- expected = [("__hello__", 104), ("__phello__", -104), ("__phello__.spam", 104)] +- else: +- expected = [("__hello__", 100), ("__phello__", -100), ("__phello__.spam", 100)] +- self.assertEqual(items, expected) ++ ft = FrozenTable.in_dll(pythonapi, "PyImport_FrozenModules") ++ # ft is a pointer to the struct_frozen entries: ++ items = [] ++ # _frozen_importlib changes size whenever importlib._bootstrap ++ # changes, so it gets a special case. We should make sure it's ++ # found, but don't worry about its size too much. ++ _fzn_implib_seen = False ++ for entry in ft: ++ # This is dangerous. We *can* iterate over a pointer, but ++ # the loop will not terminate (maybe with an access ++ # violation;-) because the pointer instance has no size. 
++ if entry.name is None: ++ break + +- from ctypes import _pointer_type_cache +- del _pointer_type_cache[struct_frozen] ++ if entry.name == b'_frozen_importlib': ++ _fzn_implib_seen = True ++ self.assertTrue(entry.size, ++ "_frozen_importlib was reported as having no size") ++ continue ++ items.append((entry.name, entry.size)) + +- def test_undefined(self): +- self.assertRaises(ValueError, c_int.in_dll, pydll, "Undefined_Symbol") ++ expected = [(b"__hello__", 161), ++ (b"__phello__", -161), ++ (b"__phello__.spam", 161), ++ ] ++ self.assertEqual(items, expected) ++ ++ self.assertTrue(_fzn_implib_seen, ++ "_frozen_importlib wasn't found in PyImport_FrozenModules") ++ ++ from ctypes import _pointer_type_cache ++ del _pointer_type_cache[struct_frozen] ++ ++ def test_undefined(self): ++ self.assertRaises(ValueError, c_int.in_dll, pythonapi, ++ "Undefined_Symbol") + + if __name__ == '__main__': + unittest.main() +diff -r c0e311e010fc Lib/ctypes/test/test_win32.py +--- a/Lib/ctypes/test/test_win32.py ++++ b/Lib/ctypes/test/test_win32.py +@@ -1,99 +1,105 @@ + # Windows specific tests + + from ctypes import * +-from ctypes.test import is_resource_enabled ++from ctypes.test import requires + import unittest, sys + from test import support + + import _ctypes_test + +-if sys.platform == "win32" and sizeof(c_void_p) == sizeof(c_int): +- # Only windows 32-bit has different calling conventions. ++# Only windows 32-bit has different calling conventions. ++@unittest.skipUnless(sys.platform == "win32", 'Windows-specific test') ++@unittest.skipUnless(sizeof(c_void_p) == sizeof(c_int), ++ "sizeof c_void_p and c_int differ") ++class WindowsTestCase(unittest.TestCase): ++ def test_callconv_1(self): ++ # Testing stdcall function + +- class WindowsTestCase(unittest.TestCase): +- def test_callconv_1(self): +- # Testing stdcall function ++ IsWindow = windll.user32.IsWindow ++ # ValueError: Procedure probably called with not enough arguments ++ # (4 bytes missing) ++ self.assertRaises(ValueError, IsWindow) + +- IsWindow = windll.user32.IsWindow +- # ValueError: Procedure probably called with not enough arguments (4 bytes missing) +- self.assertRaises(ValueError, IsWindow) ++ # This one should succeed... ++ self.assertEqual(0, IsWindow(0)) + +- # This one should succeed... 
+- self.assertEqual(0, IsWindow(0)) ++ # ValueError: Procedure probably called with too many arguments ++ # (8 bytes in excess) ++ self.assertRaises(ValueError, IsWindow, 0, 0, 0) + +- # ValueError: Procedure probably called with too many arguments (8 bytes in excess) +- self.assertRaises(ValueError, IsWindow, 0, 0, 0) ++ def test_callconv_2(self): ++ # Calling stdcall function as cdecl + +- def test_callconv_2(self): +- # Calling stdcall function as cdecl ++ IsWindow = cdll.user32.IsWindow + +- IsWindow = cdll.user32.IsWindow ++ # ValueError: Procedure called with not enough arguments ++ # (4 bytes missing) or wrong calling convention ++ self.assertRaises(ValueError, IsWindow, None) + +- # ValueError: Procedure called with not enough arguments (4 bytes missing) +- # or wrong calling convention +- self.assertRaises(ValueError, IsWindow, None) ++@unittest.skipUnless(sys.platform == "win32", 'Windows-specific test') ++class FunctionCallTestCase(unittest.TestCase): ++ @unittest.skipUnless('MSC' in sys.version, "SEH only supported by MSC") ++ @unittest.skipIf(sys.executable.endswith('_d.exe'), ++ "SEH not enabled in debug builds") ++ def test_SEH(self): ++ requires("SEH") ++ # Call functions with invalid arguments, and make sure ++ # that access violations are trapped and raise an ++ # exception. ++ self.assertRaises(OSError, windll.kernel32.GetModuleHandleA, 32) + +-if sys.platform == "win32": +- class FunctionCallTestCase(unittest.TestCase): ++ def test_noargs(self): ++ # This is a special case on win32 x64 ++ windll.user32.GetDesktopWindow() + +- if is_resource_enabled("SEH"): +- def test_SEH(self): +- # Call functions with invalid arguments, and make sure +- # that access violations are trapped and raise an +- # exception. +- self.assertRaises(OSError, windll.kernel32.GetModuleHandleA, 32) ++@unittest.skipUnless(sys.platform == "win32", 'Windows-specific test') ++class TestWintypes(unittest.TestCase): ++ def test_HWND(self): ++ from ctypes import wintypes ++ self.assertEqual(sizeof(wintypes.HWND), sizeof(c_void_p)) + +- def test_noargs(self): +- # This is a special case on win32 x64 +- windll.user32.GetDesktopWindow() ++ def test_PARAM(self): ++ from ctypes import wintypes ++ self.assertEqual(sizeof(wintypes.WPARAM), ++ sizeof(c_void_p)) ++ self.assertEqual(sizeof(wintypes.LPARAM), ++ sizeof(c_void_p)) + +- class TestWintypes(unittest.TestCase): +- def test_HWND(self): +- from ctypes import wintypes +- self.assertEqual(sizeof(wintypes.HWND), sizeof(c_void_p)) ++ def test_COMError(self): ++ from _ctypes import COMError ++ if support.HAVE_DOCSTRINGS: ++ self.assertEqual(COMError.__doc__, ++ "Raised when a COM method call failed.") + +- def test_PARAM(self): +- from ctypes import wintypes +- self.assertEqual(sizeof(wintypes.WPARAM), +- sizeof(c_void_p)) +- self.assertEqual(sizeof(wintypes.LPARAM), +- sizeof(c_void_p)) ++ ex = COMError(-1, "text", ("details",)) ++ self.assertEqual(ex.hresult, -1) ++ self.assertEqual(ex.text, "text") ++ self.assertEqual(ex.details, ("details",)) + +- def test_COMError(self): +- from _ctypes import COMError +- if support.HAVE_DOCSTRINGS: +- self.assertEqual(COMError.__doc__, +- "Raised when a COM method call failed.") ++@unittest.skipUnless(sys.platform == "win32", 'Windows-specific test') ++class TestWinError(unittest.TestCase): ++ def test_winerror(self): ++ # see Issue 16169 ++ import errno ++ ERROR_INVALID_PARAMETER = 87 ++ msg = FormatError(ERROR_INVALID_PARAMETER).strip() ++ args = (errno.EINVAL, msg, None, ERROR_INVALID_PARAMETER) + +- ex = COMError(-1, 
"text", ("details",)) +- self.assertEqual(ex.hresult, -1) +- self.assertEqual(ex.text, "text") +- self.assertEqual(ex.details, ("details",)) ++ e = WinError(ERROR_INVALID_PARAMETER) ++ self.assertEqual(e.args, args) ++ self.assertEqual(e.errno, errno.EINVAL) ++ self.assertEqual(e.winerror, ERROR_INVALID_PARAMETER) + +- class TestWinError(unittest.TestCase): +- def test_winerror(self): +- # see Issue 16169 +- import errno +- ERROR_INVALID_PARAMETER = 87 +- msg = FormatError(ERROR_INVALID_PARAMETER).strip() +- args = (errno.EINVAL, msg, None, ERROR_INVALID_PARAMETER) +- +- e = WinError(ERROR_INVALID_PARAMETER) +- self.assertEqual(e.args, args) +- self.assertEqual(e.errno, errno.EINVAL) +- self.assertEqual(e.winerror, ERROR_INVALID_PARAMETER) +- +- windll.kernel32.SetLastError(ERROR_INVALID_PARAMETER) +- try: +- raise WinError() +- except OSError as exc: +- e = exc +- self.assertEqual(e.args, args) +- self.assertEqual(e.errno, errno.EINVAL) +- self.assertEqual(e.winerror, ERROR_INVALID_PARAMETER) ++ windll.kernel32.SetLastError(ERROR_INVALID_PARAMETER) ++ try: ++ raise WinError() ++ except OSError as exc: ++ e = exc ++ self.assertEqual(e.args, args) ++ self.assertEqual(e.errno, errno.EINVAL) ++ self.assertEqual(e.winerror, ERROR_INVALID_PARAMETER) + + class Structures(unittest.TestCase): +- + def test_struct_by_value(self): + class POINT(Structure): + _fields_ = [("x", c_long), +diff -r c0e311e010fc Lib/ctypes/test/test_wintypes.py +--- a/Lib/ctypes/test/test_wintypes.py ++++ b/Lib/ctypes/test/test_wintypes.py +@@ -1,14 +1,12 @@ + import sys + import unittest + +-if not sys.platform.startswith('win'): +- raise unittest.SkipTest('Windows-only test') ++from ctypes import * + +-from ctypes import * +-from ctypes import wintypes +- ++@unittest.skipUnless(sys.platform.startswith('win'), 'Windows-only test') + class WinTypesTest(unittest.TestCase): + def test_variant_bool(self): ++ from ctypes import wintypes + # reads 16-bits from memory, anything non-zero is True + for true_value in (1, 32767, 32768, 65535, 65537): + true = POINTER(c_int16)(c_int16(true_value)) +diff -r c0e311e010fc Lib/dbm/dumb.py +--- a/Lib/dbm/dumb.py ++++ b/Lib/dbm/dumb.py +@@ -68,9 +68,10 @@ + try: + f = _io.open(self._datfile, 'r', encoding="Latin-1") + except OSError: +- f = _io.open(self._datfile, 'w', encoding="Latin-1") +- self._chmod(self._datfile) +- f.close() ++ with _io.open(self._datfile, 'w', encoding="Latin-1") as f: ++ self._chmod(self._datfile) ++ else: ++ f.close() + self._update() + + # Read directory file into the in-memory index dict. +@@ -81,12 +82,12 @@ + except OSError: + pass + else: +- for line in f: +- line = line.rstrip() +- key, pos_and_siz_pair = eval(line) +- key = key.encode('Latin-1') +- self._index[key] = pos_and_siz_pair +- f.close() ++ with f: ++ for line in f: ++ line = line.rstrip() ++ key, pos_and_siz_pair = eval(line) ++ key = key.encode('Latin-1') ++ self._index[key] = pos_and_siz_pair + + # Write the index dict to the directory file. The original directory + # file (if any) is renamed with a .bak extension first. If a .bak +@@ -108,13 +109,13 @@ + except OSError: + pass + +- f = self._io.open(self._dirfile, 'w', encoding="Latin-1") +- self._chmod(self._dirfile) +- for key, pos_and_siz_pair in self._index.items(): +- # Use Latin-1 since it has no qualms with any value in any +- # position; UTF-8, though, does care sometimes. 
+- f.write("%r, %r\n" % (key.decode('Latin-1'), pos_and_siz_pair)) +- f.close() ++ with self._io.open(self._dirfile, 'w', encoding="Latin-1") as f: ++ self._chmod(self._dirfile) ++ for key, pos_and_siz_pair in self._index.items(): ++ # Use Latin-1 since it has no qualms with any value in any ++ # position; UTF-8, though, does care sometimes. ++ entry = "%r, %r\n" % (key.decode('Latin-1'), pos_and_siz_pair) ++ f.write(entry) + + sync = _commit + +@@ -127,10 +128,9 @@ + key = key.encode('utf-8') + self._verify_open() + pos, siz = self._index[key] # may raise KeyError +- f = _io.open(self._datfile, 'rb') +- f.seek(pos) +- dat = f.read(siz) +- f.close() ++ with _io.open(self._datfile, 'rb') as f: ++ f.seek(pos) ++ dat = f.read(siz) + return dat + + # Append val to the data file, starting at a _BLOCKSIZE-aligned +@@ -138,14 +138,13 @@ + # to get to an aligned offset. Return pair + # (starting offset of val, len(val)) + def _addval(self, val): +- f = _io.open(self._datfile, 'rb+') +- f.seek(0, 2) +- pos = int(f.tell()) +- npos = ((pos + _BLOCKSIZE - 1) // _BLOCKSIZE) * _BLOCKSIZE +- f.write(b'\0'*(npos-pos)) +- pos = npos +- f.write(val) +- f.close() ++ with _io.open(self._datfile, 'rb+') as f: ++ f.seek(0, 2) ++ pos = int(f.tell()) ++ npos = ((pos + _BLOCKSIZE - 1) // _BLOCKSIZE) * _BLOCKSIZE ++ f.write(b'\0'*(npos-pos)) ++ pos = npos ++ f.write(val) + return (pos, len(val)) + + # Write val to the data file, starting at offset pos. The caller +@@ -153,10 +152,9 @@ + # pos to hold val, without overwriting some other value. Return + # pair (pos, len(val)). + def _setval(self, pos, val): +- f = _io.open(self._datfile, 'rb+') +- f.seek(pos) +- f.write(val) +- f.close() ++ with _io.open(self._datfile, 'rb+') as f: ++ f.seek(pos) ++ f.write(val) + return (pos, len(val)) + + # key is a new key whose associated value starts in the data file +@@ -164,10 +162,9 @@ + # the in-memory index dict, and append one to the directory file. 
+ def _addkey(self, key, pos_and_siz_pair): + self._index[key] = pos_and_siz_pair +- f = _io.open(self._dirfile, 'a', encoding="Latin-1") +- self._chmod(self._dirfile) +- f.write("%r, %r\n" % (key.decode("Latin-1"), pos_and_siz_pair)) +- f.close() ++ with _io.open(self._dirfile, 'a', encoding="Latin-1") as f: ++ self._chmod(self._dirfile) ++ f.write("%r, %r\n" % (key.decode("Latin-1"), pos_and_siz_pair)) + + def __setitem__(self, key, val): + if isinstance(key, str): +@@ -216,8 +213,10 @@ + self._commit() + + def keys(self): +- self._verify_open() +- return list(self._index.keys()) ++ try: ++ return list(self._index) ++ except TypeError: ++ raise error('DBM object has already been closed') from None + + def items(self): + self._verify_open() +@@ -226,17 +225,26 @@ + def __contains__(self, key): + if isinstance(key, str): + key = key.encode('utf-8') +- self._verify_open() +- return key in self._index ++ try: ++ return key in self._index ++ except TypeError: ++ if self._index is None: ++ raise error('DBM object has already been closed') from None ++ else: ++ raise + + def iterkeys(self): +- self._verify_open() +- return iter(self._index.keys()) ++ try: ++ return iter(self._index) ++ except TypeError: ++ raise error('DBM object has already been closed') from None + __iter__ = iterkeys + + def __len__(self): +- self._verify_open() +- return len(self._index) ++ try: ++ return len(self._index) ++ except TypeError: ++ raise error('DBM object has already been closed') from None + + def close(self): + self._commit() +diff -r c0e311e010fc Lib/difflib.py +--- a/Lib/difflib.py ++++ b/Lib/difflib.py +@@ -511,8 +511,8 @@ + non_adjacent.append((i1, j1, k1)) + + non_adjacent.append( (la, lb, 0) ) +- self.matching_blocks = non_adjacent +- return map(Match._make, self.matching_blocks) ++ self.matching_blocks = list(map(Match._make, non_adjacent)) ++ return self.matching_blocks + + def get_opcodes(self): + """Return list of 5-tuples describing how to turn a into b. 
+diff -r c0e311e010fc Lib/distutils/command/upload.py +--- a/Lib/distutils/command/upload.py ++++ b/Lib/distutils/command/upload.py +@@ -2,10 +2,6 @@ + + Implements the Distutils 'upload' subcommand (upload package to PyPI).""" + +-from distutils.errors import * +-from distutils.core import PyPIRCCommand +-from distutils.spawn import spawn +-from distutils import log + import sys + import os, io + import socket +@@ -13,6 +9,10 @@ + from base64 import standard_b64encode + from urllib.request import urlopen, Request, HTTPError + from urllib.parse import urlparse ++from distutils.errors import DistutilsError, DistutilsOptionError ++from distutils.core import PyPIRCCommand ++from distutils.spawn import spawn ++from distutils import log + + # this keeps compatibility for 2.3 and 2.4 + if sys.version < "2.5": +@@ -184,7 +184,7 @@ + reason = result.msg + except OSError as e: + self.announce(str(e), log.ERROR) +- return ++ raise + except HTTPError as e: + status = e.code + reason = e.msg +@@ -193,8 +193,9 @@ + self.announce('Server response (%s): %s' % (status, reason), + log.INFO) + else: +- self.announce('Upload failed (%s): %s' % (status, reason), +- log.ERROR) ++ msg = 'Upload failed (%s): %s' % (status, reason) ++ self.announce(msg, log.ERROR) ++ raise DistutilsError(msg) + if self.show_response: + text = self._read_pypi_response(result) + msg = '\n'.join(('-' * 75, text, '-' * 75)) +diff -r c0e311e010fc Lib/distutils/sysconfig.py +--- a/Lib/distutils/sysconfig.py ++++ b/Lib/distutils/sysconfig.py +@@ -179,7 +179,8 @@ + # version and build tools may not support the same set + # of CPU architectures for universal builds. + global _config_vars +- if not _config_vars.get('CUSTOMIZED_OSX_COMPILER', ''): ++ # Use get_config_var() to ensure _config_vars is initialized. ++ if not get_config_var('CUSTOMIZED_OSX_COMPILER'): + import _osx_support + _osx_support.customize_compiler(_config_vars) + _config_vars['CUSTOMIZED_OSX_COMPILER'] = 'True' +diff -r c0e311e010fc Lib/distutils/tests/test_build_ext.py +--- a/Lib/distutils/tests/test_build_ext.py ++++ b/Lib/distutils/tests/test_build_ext.py +@@ -444,8 +444,16 @@ + + # get the deployment target that the interpreter was built with + target = sysconfig.get_config_var('MACOSX_DEPLOYMENT_TARGET') +- target = tuple(map(int, target.split('.'))) +- target = '%02d%01d0' % target ++ target = tuple(map(int, target.split('.')[0:2])) ++ # format the target value as defined in the Apple ++ # Availability Macros. We can't use the macro names since ++ # at least one value we test with will not exist yet. ++ if target[1] < 10: ++ # for 10.1 through 10.9.x -> "10n0" ++ target = '%02d%01d0' % target ++ else: ++ # for 10.10 and beyond -> "10nn00" ++ target = '%02d%02d00' % target + deptarget_ext = Extension( + 'deptarget', + [deptarget_c], +diff -r c0e311e010fc Lib/distutils/tests/test_sysconfig.py +--- a/Lib/distutils/tests/test_sysconfig.py ++++ b/Lib/distutils/tests/test_sysconfig.py +@@ -1,6 +1,9 @@ + """Tests for distutils.sysconfig.""" + import os + import shutil ++import subprocess ++import sys ++import textwrap + import unittest + + from distutils import sysconfig +@@ -174,6 +177,25 @@ + self.assertIsNotNone(vars['SO']) + self.assertEqual(vars['SO'], vars['EXT_SUFFIX']) + ++ def test_customize_compiler_before_get_config_vars(self): ++ # Issue #21923: test that a Distribution compiler ++ # instance can be called without an explicit call to ++ # get_config_vars(). 
++ with open(TESTFN, 'w') as f: ++ f.writelines(textwrap.dedent('''\ ++ from distutils.core import Distribution ++ config = Distribution().get_command_obj('config') ++ # try_compile may pass or it may fail if no compiler ++ # is found but it should not raise an exception. ++ rc = config.try_compile('int x;') ++ ''')) ++ p = subprocess.Popen([str(sys.executable), TESTFN], ++ stdout=subprocess.PIPE, ++ stderr=subprocess.STDOUT, ++ universal_newlines=True) ++ outs, errs = p.communicate() ++ self.assertEqual(0, p.returncode, "Subprocess failed: " + outs) ++ + + def test_suite(): + suite = unittest.TestSuite() +diff -r c0e311e010fc Lib/distutils/tests/test_upload.py +--- a/Lib/distutils/tests/test_upload.py ++++ b/Lib/distutils/tests/test_upload.py +@@ -6,6 +6,7 @@ + from distutils.command import upload as upload_mod + from distutils.command.upload import upload + from distutils.core import Distribution ++from distutils.errors import DistutilsError + from distutils.log import INFO + + from distutils.tests.test_config import PYPIRC, PyPIRCCommandTestCase +@@ -41,13 +42,14 @@ + + class FakeOpen(object): + +- def __init__(self, url): ++ def __init__(self, url, msg=None, code=None): + self.url = url + if not isinstance(url, str): + self.req = url + else: + self.req = None +- self.msg = 'OK' ++ self.msg = msg or 'OK' ++ self.code = code or 200 + + def getheader(self, name, default=None): + return { +@@ -58,7 +60,7 @@ + return b'xyzzy' + + def getcode(self): +- return 200 ++ return self.code + + + class uploadTestCase(PyPIRCCommandTestCase): +@@ -68,13 +70,15 @@ + self.old_open = upload_mod.urlopen + upload_mod.urlopen = self._urlopen + self.last_open = None ++ self.next_msg = None ++ self.next_code = None + + def tearDown(self): + upload_mod.urlopen = self.old_open + super(uploadTestCase, self).tearDown() + + def _urlopen(self, url): +- self.last_open = FakeOpen(url) ++ self.last_open = FakeOpen(url, msg=self.next_msg, code=self.next_code) + return self.last_open + + def test_finalize_options(self): +@@ -135,6 +139,10 @@ + results = self.get_logs(INFO) + self.assertIn('xyzzy\n', results[-1]) + ++ def test_upload_fails(self): ++ self.next_msg = "Not Found" ++ self.next_code = 404 ++ self.assertRaises(DistutilsError, self.test_upload) + + def test_suite(): + return unittest.makeSuite(uploadTestCase) +diff -r c0e311e010fc Lib/email/parser.py +--- a/Lib/email/parser.py ++++ b/Lib/email/parser.py +@@ -106,8 +106,10 @@ + meaning it parses the entire contents of the file. + """ + fp = TextIOWrapper(fp, encoding='ascii', errors='surrogateescape') +- with fp: ++ try: + return self.parser.parse(fp, headersonly) ++ finally: ++ fp.detach() + + + def parsebytes(self, text, headersonly=False): +diff -r c0e311e010fc Lib/http/cookiejar.py +--- a/Lib/http/cookiejar.py ++++ b/Lib/http/cookiejar.py +@@ -1722,12 +1722,12 @@ + def __repr__(self): + r = [] + for cookie in self: r.append(repr(cookie)) +- return "<%s[%s]>" % (self.__class__, ", ".join(r)) ++ return "<%s[%s]>" % (self.__class__.__name__, ", ".join(r)) + + def __str__(self): + r = [] + for cookie in self: r.append(str(cookie)) +- return "<%s[%s]>" % (self.__class__, ", ".join(r)) ++ return "<%s[%s]>" % (self.__class__.__name__, ", ".join(r)) + + + # derives from OSError for backwards-compatibility with Python 2.4.0 +diff -r c0e311e010fc Lib/http/server.py +--- a/Lib/http/server.py ++++ b/Lib/http/server.py +@@ -977,7 +977,7 @@ + (and the next character is a '/' or the end of the string). 
+ + """ +- collapsed_path = _url_collapse_path(self.path) ++ collapsed_path = _url_collapse_path(urllib.parse.unquote(self.path)) + dir_sep = collapsed_path.find('/', 1) + head, tail = collapsed_path[:dir_sep], collapsed_path[dir_sep+1:] + if head in self.cgi_directories: +@@ -1000,16 +1000,16 @@ + def run_cgi(self): + """Execute a CGI script.""" + dir, rest = self.cgi_info +- +- i = rest.find('/') ++ path = dir + '/' + rest ++ i = path.find('/', len(dir)+1) + while i >= 0: +- nextdir = rest[:i] +- nextrest = rest[i+1:] ++ nextdir = path[:i] ++ nextrest = path[i+1:] + + scriptdir = self.translate_path(nextdir) + if os.path.isdir(scriptdir): + dir, rest = nextdir, nextrest +- i = rest.find('/') ++ i = path.find('/', len(dir)+1) + else: + break + +diff -r c0e311e010fc Lib/idlelib/AutoComplete.py +--- a/Lib/idlelib/AutoComplete.py ++++ b/Lib/idlelib/AutoComplete.py +@@ -226,3 +226,8 @@ + namespace = sys.modules.copy() + namespace.update(__main__.__dict__) + return eval(name, namespace) ++ ++ ++if __name__ == '__main__': ++ from unittest import main ++ main('idlelib.idle_test.test_autocomplete', verbosity=2) +diff -r c0e311e010fc Lib/idlelib/AutoExpand.py +--- a/Lib/idlelib/AutoExpand.py ++++ b/Lib/idlelib/AutoExpand.py +@@ -1,3 +1,17 @@ ++'''Complete the current word before the cursor with words in the editor. ++ ++Each menu selection or shortcut key selection replaces the word with a ++different word with the same prefix. The search for matches begins ++before the target and moves toward the top of the editor. It then starts ++after the cursor and moves down. It then returns to the original word and ++the cycle starts again. ++ ++Changing the current text line or leaving the cursor in a different ++place before requesting the next selection causes AutoExpand to reset ++its state. ++ ++This is an extension file and there is only one instance of AutoExpand. ++''' + import string + import re + +@@ -20,6 +34,7 @@ + self.state = None + + def expand_word_event(self, event): ++ "Replace the current word with the next expansion." + curinsert = self.text.index("insert") + curline = self.text.get("insert linestart", "insert lineend") + if not self.state: +@@ -46,6 +61,7 @@ + return "break" + + def getwords(self): ++ "Return a list of words that match the prefix before the cursor." + word = self.getprevword() + if not word: + return [] +@@ -76,8 +92,13 @@ + return words + + def getprevword(self): ++ "Return the word prefix before the cursor." + line = self.text.get("insert linestart", "insert") + i = len(line) + while i > 0 and line[i-1] in self.wordchars: + i = i-1 + return line[i:] ++ ++if __name__ == '__main__': ++ import unittest ++ unittest.main('idlelib.idle_test.test_autoexpand', verbosity=2) +diff -r c0e311e010fc Lib/idlelib/CallTipWindow.py +--- a/Lib/idlelib/CallTipWindow.py ++++ b/Lib/idlelib/CallTipWindow.py +@@ -2,9 +2,8 @@ + + After ToolTip.py, which uses ideas gleaned from PySol + Used by the CallTips IDLE extension. 
+- + """ +-from tkinter import * ++from tkinter import Toplevel, Label, LEFT, SOLID, TclError + + HIDE_VIRTUAL_EVENT_NAME = "<>" + HIDE_SEQUENCES = ("", "") +@@ -133,37 +132,39 @@ + return bool(self.tipwindow) + + ++def _calltip_window(parent): # htest # ++ import re ++ from tkinter import Tk, Text, LEFT, BOTH + +-############################### +-# +-# Test Code +-# +-class container: # Conceptually an editor_window +- def __init__(self): +- root = Tk() +- text = self.text = Text(root) +- text.pack(side=LEFT, fill=BOTH, expand=1) +- text.insert("insert", "string.split") +- root.update() +- self.calltip = CallTip(text) ++ root = Tk() ++ root.title("Test calltips") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) + +- text.event_add("<>", "(") +- text.event_add("<>", ")") +- text.bind("<>", self.calltip_show) +- text.bind("<>", self.calltip_hide) ++ class MyEditWin: # conceptually an editor_window ++ def __init__(self): ++ text = self.text = Text(root) ++ text.pack(side=LEFT, fill=BOTH, expand=1) ++ text.insert("insert", "string.split") ++ root.update() ++ self.calltip = CallTip(text) + +- text.focus_set() +- root.mainloop() ++ text.event_add("<>", "(") ++ text.event_add("<>", ")") ++ text.bind("<>", self.calltip_show) ++ text.bind("<>", self.calltip_hide) + +- def calltip_show(self, event): +- self.calltip.showtip("Hello world") ++ text.focus_set() ++ root.mainloop() + +- def calltip_hide(self, event): +- self.calltip.hidetip() ++ def calltip_show(self, event): ++ self.calltip.showtip("Hello world", "insert", "end") + +-def main(): +- # Test code +- c=container() ++ def calltip_hide(self, event): ++ self.calltip.hidetip() ++ ++ MyEditWin() + + if __name__=='__main__': +- main() ++ from idlelib.idle_test.htest import run ++ run(_calltip_window) +diff -r c0e311e010fc Lib/idlelib/ClassBrowser.py +--- a/Lib/idlelib/ClassBrowser.py ++++ b/Lib/idlelib/ClassBrowser.py +@@ -21,11 +21,15 @@ + + class ClassBrowser: + +- def __init__(self, flist, name, path): ++ def __init__(self, flist, name, path, _htest=False): + # XXX This API should change, if the file doesn't end in ".py" + # XXX the code here is bogus! ++ """ ++ _htest - bool, change box when location running htest. 
++ """ + self.name = name + self.file = os.path.join(path[0], self.name + ".py") ++ self._htest = _htest + self.init(flist) + + def close(self, event=None): +@@ -40,6 +44,9 @@ + self.top = top = ListedToplevel(flist.root) + top.protocol("WM_DELETE_WINDOW", self.close) + top.bind("", self.close) ++ if self._htest: # place dialog below parent if running htest ++ top.geometry("+%d+%d" % ++ (flist.root.winfo_rootx(), flist.root.winfo_rooty() + 200)) + self.settitle() + top.focus_set() + # create scrolled canvas +@@ -94,7 +101,7 @@ + return [] + try: + dict = pyclbr.readmodule_ex(name, [dir] + sys.path) +- except ImportError as msg: ++ except ImportError: + return [] + items = [] + self.classes = {} +@@ -202,7 +209,7 @@ + edit = PyShell.flist.open(self.file) + edit.gotoline(self.cl.methods[self.name]) + +-def main(): ++def _class_browser(parent): #Wrapper for htest + try: + file = __file__ + except NameError: +@@ -213,9 +220,10 @@ + file = sys.argv[0] + dir, file = os.path.split(file) + name = os.path.splitext(file)[0] +- ClassBrowser(PyShell.flist, name, [dir]) +- if sys.stdin is sys.__stdin__: +- mainloop() ++ flist = PyShell.PyShellFileList(parent) ++ ClassBrowser(flist, name, [dir], _htest=True) ++ parent.mainloop() + + if __name__ == "__main__": +- main() ++ from idlelib.idle_test.htest import run ++ run(_class_browser) +diff -r c0e311e010fc Lib/idlelib/ColorDelegator.py +--- a/Lib/idlelib/ColorDelegator.py ++++ b/Lib/idlelib/ColorDelegator.py +@@ -253,17 +253,21 @@ + for tag in self.tagdefs: + self.tag_remove(tag, "1.0", "end") + +-def main(): ++def _color_delegator(parent): + from idlelib.Percolator import Percolator + root = Tk() +- root.wm_protocol("WM_DELETE_WINDOW", root.quit) +- text = Text(background="white") ++ root.title("Test ColorDelegator") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ source = "if somename: x = 'abc' # comment\nprint" ++ text = Text(root, background="white") ++ text.insert("insert", source) + text.pack(expand=1, fill="both") +- text.focus_set() + p = Percolator(text) + d = ColorDelegator() + p.insertfilter(d) + root.mainloop() + + if __name__ == "__main__": +- main() ++ from idlelib.idle_test.htest import run ++ run(_color_delegator) +diff -r c0e311e010fc Lib/idlelib/Debugger.py +--- a/Lib/idlelib/Debugger.py ++++ b/Lib/idlelib/Debugger.py +@@ -1,6 +1,5 @@ + import os + import bdb +-import types + from tkinter import * + from idlelib.WindowList import ListedToplevel + from idlelib.ScrolledList import ScrolledList +diff -r c0e311e010fc Lib/idlelib/EditorWindow.py +--- a/Lib/idlelib/EditorWindow.py ++++ b/Lib/idlelib/EditorWindow.py +@@ -79,7 +79,7 @@ + self.parent = None + + helpDialog = HelpDialog() # singleton instance +-def _Help_dialog(parent): # wrapper for htest ++def _help_dialog(parent): # wrapper for htest + helpDialog.show_dialog(parent) + + +@@ -1702,21 +1702,18 @@ + tk.call('set', 'tcl_nonwordchars', '[^a-zA-Z0-9_]') + + +-def _Editor_window(parent): ++def _editor_window(parent): + root = parent + fixwordbreaks(root) +- root.withdraw() + if sys.argv[1:]: + filename = sys.argv[1] + else: + filename = None + macosxSupport.setupApp(root, None) + edit = EditorWindow(root=root, filename=filename) +- edit.set_close_hook(root.quit) + edit.text.bind("<>", edit.close_event) ++ parent.mainloop() + + if __name__ == '__main__': + from idlelib.idle_test.htest import run +- if len(sys.argv) <= 1: +- run(_Help_dialog) +- run(_Editor_window) ++ run(_help_dialog, _editor_window) +diff -r 
c0e311e010fc Lib/idlelib/FormatParagraph.py +--- a/Lib/idlelib/FormatParagraph.py ++++ b/Lib/idlelib/FormatParagraph.py +@@ -188,7 +188,6 @@ + return m.group(1) + + if __name__ == "__main__": +- from test import support; support.use_resources = ['gui'] + import unittest + unittest.main('idlelib.idle_test.test_formatparagraph', + verbosity=2, exit=False) +diff -r c0e311e010fc Lib/idlelib/GrepDialog.py +--- a/Lib/idlelib/GrepDialog.py ++++ b/Lib/idlelib/GrepDialog.py +@@ -1,9 +1,14 @@ + import os + import fnmatch ++import re # for htest + import sys +-from tkinter import * ++from tkinter import StringVar, BooleanVar, Checkbutton # for GrepDialog ++from tkinter import Tk, Text, Button, SEL, END # for htest + from idlelib import SearchEngine ++import itertools + from idlelib.SearchDialogBase import SearchDialogBase ++# Importing OutputWindow fails due to import loop ++# EditorWindow -> GrepDialop -> OutputWindow -> EditorWindow + + def grep(text, io=None, flist=None): + root = text._root() +@@ -40,10 +45,10 @@ + + def create_entries(self): + SearchDialogBase.create_entries(self) +- self.globent = self.make_entry("In files:", self.globvar) ++ self.globent = self.make_entry("In files:", self.globvar)[0] + + def create_other_buttons(self): +- f = self.make_frame() ++ f = self.make_frame()[0] + + btn = Checkbutton(f, anchor="w", + variable=self.recvar, +@@ -63,7 +68,7 @@ + if not path: + self.top.bell() + return +- from idlelib.OutputWindow import OutputWindow ++ from idlelib.OutputWindow import OutputWindow # leave here! + save = sys.stdout + try: + sys.stdout = OutputWindow(self.flist) +@@ -79,21 +84,26 @@ + pat = self.engine.getpat() + print("Searching %r in %s ..." % (pat, path)) + hits = 0 +- for fn in list: +- try: +- with open(fn, errors='replace') as f: +- for lineno, line in enumerate(f, 1): +- if line[-1:] == '\n': +- line = line[:-1] +- if prog.search(line): +- sys.stdout.write("%s: %s: %s\n" % +- (fn, lineno, line)) +- hits += 1 +- except OSError as msg: +- print(msg) +- print(("Hits found: %s\n" +- "(Hint: right-click to open locations.)" +- % hits) if hits else "No hits.") ++ try: ++ for fn in list: ++ try: ++ with open(fn, errors='replace') as f: ++ for lineno, line in enumerate(f, 1): ++ if line[-1:] == '\n': ++ line = line[:-1] ++ if prog.search(line): ++ sys.stdout.write("%s: %s: %s\n" % ++ (fn, lineno, line)) ++ hits += 1 ++ except OSError as msg: ++ print(msg) ++ print(("Hits found: %s\n" ++ "(Hint: right-click to open locations.)" ++ % hits) if hits else "No hits.") ++ except AttributeError: ++ # Tk window has been closed, OutputWindow.text = None, ++ # so in OW.write, OW.text.insert fails. ++ pass + + def findfiles(self, dir, base, rec): + try: +@@ -120,8 +130,30 @@ + self.top.grab_release() + self.top.withdraw() + ++ ++def _grep_dialog(parent): # for htest ++ from idlelib.PyShell import PyShellFileList ++ root = Tk() ++ root.title("Test GrepDialog") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ ++ flist = PyShellFileList(root) ++ text = Text(root, height=5) ++ text.pack() ++ ++ def show_grep_dialog(): ++ text.tag_add(SEL, "1.0", END) ++ grep(text, flist=flist) ++ text.tag_remove(SEL, "1.0", END) ++ ++ button = Button(root, text="Show GrepDialog", command=show_grep_dialog) ++ button.pack() ++ root.mainloop() ++ + if __name__ == "__main__": +- # A human test is a bit tricky since EditorWindow() imports this module. +- # Hence Idle must be restarted after editing this file for a live test. 
+ import unittest + unittest.main('idlelib.idle_test.test_grep', verbosity=2, exit=False) ++ ++ from idlelib.idle_test.htest import run ++ run(_grep_dialog) +diff -r c0e311e010fc Lib/idlelib/HyperParser.py +--- a/Lib/idlelib/HyperParser.py ++++ b/Lib/idlelib/HyperParser.py +@@ -1,23 +1,31 @@ +-""" +-HyperParser +-=========== +-This module defines the HyperParser class, which provides advanced parsing +-abilities for the ParenMatch and other extensions. +-The HyperParser uses PyParser. PyParser is intended mostly to give information +-on the proper indentation of code. HyperParser gives some information on the +-structure of code, used by extensions to help the user. ++"""Provide advanced parsing abilities for ParenMatch and other extensions. ++ ++HyperParser uses PyParser. PyParser mostly gives information on the ++proper indentation of code. HyperParser gives additional information on ++the structure of code. + """ + + import string +-import keyword ++from keyword import iskeyword + from idlelib import PyParse + ++ ++# all ASCII chars that may be in an identifier ++_ASCII_ID_CHARS = frozenset(string.ascii_letters + string.digits + "_") ++# all ASCII chars that may be the first char of an identifier ++_ASCII_ID_FIRST_CHARS = frozenset(string.ascii_letters + "_") ++ ++# lookup table for whether 7-bit ASCII chars are valid in a Python identifier ++_IS_ASCII_ID_CHAR = [(chr(x) in _ASCII_ID_CHARS) for x in range(128)] ++# lookup table for whether 7-bit ASCII chars are valid as the first ++# char in a Python identifier ++_IS_ASCII_ID_FIRST_CHAR = \ ++ [(chr(x) in _ASCII_ID_FIRST_CHARS) for x in range(128)] ++ ++ + class HyperParser: +- + def __init__(self, editwin, index): +- """Initialize the HyperParser to analyze the surroundings of the given +- index. +- """ ++ "To initialize, analyze the surroundings of the given index." + + self.editwin = editwin + self.text = text = editwin.text +@@ -33,9 +41,10 @@ + startat = max(lno - context, 1) + startatindex = repr(startat) + ".0" + stopatindex = "%d.end" % lno +- # We add the newline because PyParse requires a newline at end. +- # We add a space so that index won't be at end of line, so that +- # its status will be the same as the char before it, if should. ++ # We add the newline because PyParse requires a newline ++ # at end. We add a space so that index won't be at end ++ # of line, so that its status will be the same as the ++ # char before it, if should. + parser.set_str(text.get(startatindex, stopatindex)+' \n') + bod = parser.find_good_parse_start( + editwin._build_char_in_string_func(startatindex)) +@@ -49,122 +58,175 @@ + else: + startatindex = "1.0" + stopatindex = "%d.end" % lno +- # We add the newline because PyParse requires a newline at end. +- # We add a space so that index won't be at end of line, so that +- # its status will be the same as the char before it, if should. ++ # We add the newline because PyParse requires it. We add a ++ # space so that index won't be at end of line, so that its ++ # status will be the same as the char before it, if should. + parser.set_str(text.get(startatindex, stopatindex)+' \n') + parser.set_lo(0) + +- # We want what the parser has, except for the last newline and space. ++ # We want what the parser has, minus the last newline and space. + self.rawtext = parser.str[:-2] +- # As far as I can see, parser.str preserves the statement we are in, +- # so that stopatindex can be used to synchronize the string with the +- # text box indices. 
++ # Parser.str apparently preserves the statement we are in, so ++ # that stopatindex can be used to synchronize the string with ++ # the text box indices. + self.stopatindex = stopatindex + self.bracketing = parser.get_last_stmt_bracketing() +- # find which pairs of bracketing are openers. These always correspond +- # to a character of rawtext. +- self.isopener = [i>0 and self.bracketing[i][1] > self.bracketing[i-1][1] ++ # find which pairs of bracketing are openers. These always ++ # correspond to a character of rawtext. ++ self.isopener = [i>0 and self.bracketing[i][1] > ++ self.bracketing[i-1][1] + for i in range(len(self.bracketing))] + + self.set_index(index) + + def set_index(self, index): +- """Set the index to which the functions relate. Note that it must be +- in the same statement. ++ """Set the index to which the functions relate. ++ ++ The index must be in the same statement. + """ +- indexinrawtext = \ +- len(self.rawtext) - len(self.text.get(index, self.stopatindex)) ++ indexinrawtext = (len(self.rawtext) - ++ len(self.text.get(index, self.stopatindex))) + if indexinrawtext < 0: +- raise ValueError("The index given is before the analyzed statement") ++ raise ValueError("Index %s precedes the analyzed statement" ++ % index) + self.indexinrawtext = indexinrawtext + # find the rightmost bracket to which index belongs + self.indexbracket = 0 +- while self.indexbracket < len(self.bracketing)-1 and \ +- self.bracketing[self.indexbracket+1][0] < self.indexinrawtext: ++ while (self.indexbracket < len(self.bracketing)-1 and ++ self.bracketing[self.indexbracket+1][0] < self.indexinrawtext): + self.indexbracket += 1 +- if self.indexbracket < len(self.bracketing)-1 and \ +- self.bracketing[self.indexbracket+1][0] == self.indexinrawtext and \ +- not self.isopener[self.indexbracket+1]: ++ if (self.indexbracket < len(self.bracketing)-1 and ++ self.bracketing[self.indexbracket+1][0] == self.indexinrawtext and ++ not self.isopener[self.indexbracket+1]): + self.indexbracket += 1 + + def is_in_string(self): +- """Is the index given to the HyperParser is in a string?""" ++ """Is the index given to the HyperParser in a string?""" + # The bracket to which we belong should be an opener. + # If it's an opener, it has to have a character. +- return self.isopener[self.indexbracket] and \ +- self.rawtext[self.bracketing[self.indexbracket][0]] in ('"', "'") ++ return (self.isopener[self.indexbracket] and ++ self.rawtext[self.bracketing[self.indexbracket][0]] ++ in ('"', "'")) + + def is_in_code(self): +- """Is the index given to the HyperParser is in a normal code?""" +- return not self.isopener[self.indexbracket] or \ +- self.rawtext[self.bracketing[self.indexbracket][0]] not in \ +- ('#', '"', "'") ++ """Is the index given to the HyperParser in normal code?""" ++ return (not self.isopener[self.indexbracket] or ++ self.rawtext[self.bracketing[self.indexbracket][0]] ++ not in ('#', '"', "'")) + + def get_surrounding_brackets(self, openers='([{', mustclose=False): +- """If the index given to the HyperParser is surrounded by a bracket +- defined in openers (or at least has one before it), return the +- indices of the opening bracket and the closing bracket (or the +- end of line, whichever comes first). +- If it is not surrounded by brackets, or the end of line comes before +- the closing bracket and mustclose is True, returns None. ++ """Return bracket indexes or None. 
++
++ If the index given to the HyperParser is surrounded by a
++ bracket defined in openers (or at least has one before it),
++ return the indices of the opening bracket and the closing
++ bracket (or the end of line, whichever comes first).
++
++ If it is not surrounded by brackets, or the end of line comes
++ before the closing bracket and mustclose is True, returns None.
+ """
++
+ bracketinglevel = self.bracketing[self.indexbracket][1]
+ before = self.indexbracket
+- while not self.isopener[before] or \
+- self.rawtext[self.bracketing[before][0]] not in openers or \
+- self.bracketing[before][1] > bracketinglevel:
++ while (not self.isopener[before] or
++ self.rawtext[self.bracketing[before][0]] not in openers or
++ self.bracketing[before][1] > bracketinglevel):
+ before -= 1
+ if before < 0:
+ return None
+ bracketinglevel = min(bracketinglevel, self.bracketing[before][1])
+ after = self.indexbracket + 1
+- while after < len(self.bracketing) and \
+- self.bracketing[after][1] >= bracketinglevel:
++ while (after < len(self.bracketing) and
++ self.bracketing[after][1] >= bracketinglevel):
+ after += 1
+
+ beforeindex = self.text.index("%s-%dc" %
+ (self.stopatindex, len(self.rawtext)-self.bracketing[before][0]))
+- if after >= len(self.bracketing) or \
+- self.bracketing[after][0] > len(self.rawtext):
++ if (after >= len(self.bracketing) or
++ self.bracketing[after][0] > len(self.rawtext)):
+ if mustclose:
+ return None
+ afterindex = self.stopatindex
+ else:
+- # We are after a real char, so it is a ')' and we give the index
+- # before it.
+- afterindex = self.text.index("%s-%dc" %
+- (self.stopatindex,
++ # We are after a real char, so it is a ')' and we give the
++ # index before it.
++ afterindex = self.text.index(
++ "%s-%dc" % (self.stopatindex,
+ len(self.rawtext)-(self.bracketing[after][0]-1)))
+
+ return beforeindex, afterindex
+
++ # the set of built-in identifiers which are also keywords,
++ # i.e. keyword.iskeyword() returns True for them
++ _ID_KEYWORDS = frozenset({"True", "False", "None"})
++
++ @classmethod
++ def _eat_identifier(cls, str, limit, pos):
++ """Given a string and pos, return the number of chars in the
++ identifier which ends at pos, or 0 if there is no such one.
++
++ Non-identifier characters end the scan; keywords other than
++ True, False and None are not treated as identifiers.
++ """
++ is_ascii_id_char = _IS_ASCII_ID_CHAR
++
++ # Start at the end (pos) and work backwards.
++ i = pos
++
++ # Go backwards as long as the characters are valid ASCII
++ # identifier characters. This is an optimization, since it
++ # is faster in the common case where most of the characters
++ # are ASCII.
++ while i > limit and (
++ ord(str[i - 1]) < 128 and
++ is_ascii_id_char[ord(str[i - 1])]
++ ):
++ i -= 1
++
++ # If the above loop ended due to reaching a non-ASCII
++ # character, continue going backwards using the most generic
++ # test for whether a string contains only valid identifier
++ # characters.
++ if i > limit and ord(str[i - 1]) >= 128:
++ while i - 4 >= limit and ('a' + str[i - 4:pos]).isidentifier():
++ i -= 4
++ if i - 2 >= limit and ('a' + str[i - 2:pos]).isidentifier():
++ i -= 2
++ if i - 1 >= limit and ('a' + str[i - 1:pos]).isidentifier():
++ i -= 1
++
++ # The identifier candidate starts here. If it isn't a valid
++ # identifier, don't eat anything. At this point that is only
++ # possible if the first character isn't a valid first
++ # character for an identifier.
++ if not str[i:pos].isidentifier(): ++ return 0 ++ elif i < pos: ++ # All characters in str[i:pos] are valid ASCII identifier ++ # characters, so it is enough to check that the first is ++ # valid as the first character of an identifier. ++ if not _IS_ASCII_ID_FIRST_CHAR[ord(str[i])]: ++ return 0 ++ ++ # All keywords are valid identifiers, but should not be ++ # considered identifiers here, except for True, False and None. ++ if i < pos and ( ++ iskeyword(str[i:pos]) and ++ str[i:pos] not in cls._ID_KEYWORDS ++ ): ++ return 0 ++ ++ return pos - i ++ + # This string includes all chars that may be in a white space + _whitespace_chars = " \t\n\\" +- # This string includes all chars that may be in an identifier +- _id_chars = string.ascii_letters + string.digits + "_" +- # This string includes all chars that may be the first char of an identifier +- _id_first_chars = string.ascii_letters + "_" +- +- # Given a string and pos, return the number of chars in the identifier +- # which ends at pos, or 0 if there is no such one. Saved words are not +- # identifiers. +- def _eat_identifier(self, str, limit, pos): +- i = pos +- while i > limit and str[i-1] in self._id_chars: +- i -= 1 +- if i < pos and (str[i] not in self._id_first_chars or \ +- keyword.iskeyword(str[i:pos])): +- i = pos +- return pos - i + + def get_expression(self): +- """Return a string with the Python expression which ends at the given +- index, which is empty if there is no real one. ++ """Return a string with the Python expression which ends at the ++ given index, which is empty if there is no real one. + """ + if not self.is_in_code(): +- raise ValueError("get_expression should only be called if index "\ +- "is inside a code.") ++ raise ValueError("get_expression should only be called" ++ "if index is inside a code.") + + rawtext = self.rawtext + bracketing = self.bracketing +@@ -177,20 +239,20 @@ + postdot_phase = True + + while 1: +- # Eat whitespaces, comments, and if postdot_phase is False - one dot ++ # Eat whitespaces, comments, and if postdot_phase is False - a dot + while 1: + if pos>brck_limit and rawtext[pos-1] in self._whitespace_chars: + # Eat a whitespace + pos -= 1 +- elif not postdot_phase and \ +- pos > brck_limit and rawtext[pos-1] == '.': ++ elif (not postdot_phase and ++ pos > brck_limit and rawtext[pos-1] == '.'): + # Eat a dot + pos -= 1 + postdot_phase = True +- # The next line will fail if we are *inside* a comment, but we +- # shouldn't be. +- elif pos == brck_limit and brck_index > 0 and \ +- rawtext[bracketing[brck_index-1][0]] == '#': ++ # The next line will fail if we are *inside* a comment, ++ # but we shouldn't be. ++ elif (pos == brck_limit and brck_index > 0 and ++ rawtext[bracketing[brck_index-1][0]] == '#'): + # Eat a comment + brck_index -= 2 + brck_limit = bracketing[brck_index][0] +@@ -200,8 +262,8 @@ + break + + if not postdot_phase: +- # We didn't find a dot, so the expression end at the last +- # identifier pos. ++ # We didn't find a dot, so the expression end at the ++ # last identifier pos. + break + + ret = self._eat_identifier(rawtext, brck_limit, pos) +@@ -209,13 +271,13 @@ + # There is an identifier to eat + pos = pos - ret + last_identifier_pos = pos +- # Now, in order to continue the search, we must find a dot. ++ # Now, to continue the search, we must find a dot. + postdot_phase = False + # (the loop continues now) + + elif pos == brck_limit: +- # We are at a bracketing limit. If it is a closing bracket, +- # eat the bracket, otherwise, stop the search. 
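The backwards identifier scan added in this hunk is dominated by its ASCII fast path. A condensed, pure-Python restatement of the same idea, with hypothetical names and without the fast path, is sketched here for orientation:

    import keyword

    def eat_identifier(s, limit, pos, _keep=frozenset({"True", "False", "None"})):
        """Length of the identifier ending at s[pos], scanning back no further than limit."""
        i = pos
        # Prefixing 'a' lets characters that cannot start an identifier
        # (such as digits) pass the test, mirroring the patched code.
        while i > limit and ('a' + s[i-1:pos]).isidentifier():
            i -= 1
        candidate = s[i:pos]
        if not candidate.isidentifier():
            return 0
        if keyword.iskeyword(candidate) and candidate not in _keep:
            return 0
        return pos - i

    # eat_identifier("x = foo.bar", 0, 11) -> 3 (the identifier 'bar')
    # eat_identifier("while", 0, 5) -> 0 (a keyword, not an identifier)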
++ # We are at a bracketing limit. If it is a closing ++ # bracket, eat the bracket, otherwise, stop the search. + level = bracketing[brck_index][1] + while brck_index > 0 and bracketing[brck_index-1][1] > level: + brck_index -= 1 +@@ -244,3 +306,8 @@ + break + + return rawtext[last_identifier_pos:self.indexinrawtext] ++ ++ ++if __name__ == '__main__': ++ import unittest ++ unittest.main('idlelib.idle_test.test_hyperparser', verbosity=2) +diff -r c0e311e010fc Lib/idlelib/IOBinding.py +--- a/Lib/idlelib/IOBinding.py ++++ b/Lib/idlelib/IOBinding.py +@@ -525,16 +525,17 @@ + if self.editwin.flist: + self.editwin.update_recent_files_list(filename) + +-def test(): ++def _io_binding(parent): + root = Tk() ++ root.title("Test IOBinding") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) + class MyEditWin: + def __init__(self, text): + self.text = text + self.flist = None + self.text.bind("", self.open) + self.text.bind("", self.save) +- self.text.bind("", self.save_as) +- self.text.bind("", self.save_a_copy) + def get_saved(self): return 0 + def set_saved(self, flag): pass + def reset_undo(self): pass +@@ -542,16 +543,13 @@ + self.text.event_generate("<>") + def save(self, event): + self.text.event_generate("<>") +- def save_as(self, event): +- self.text.event_generate("<>") +- def save_a_copy(self, event): +- self.text.event_generate("<>") ++ + text = Text(root) + text.pack() + text.focus_set() + editwin = MyEditWin(text) + io = IOBinding(editwin) +- root.mainloop() + + if __name__ == "__main__": +- test() ++ from idlelib.idle_test.htest import run ++ run(_io_binding) +diff -r c0e311e010fc Lib/idlelib/IdleHistory.py +--- a/Lib/idlelib/IdleHistory.py ++++ b/Lib/idlelib/IdleHistory.py +@@ -100,7 +100,5 @@ + self.prefix = None + + if __name__ == "__main__": +- from test import support +- support.use_resources = ['gui'] + from unittest import main + main('idlelib.idle_test.test_idlehistory', verbosity=2, exit=False) +diff -r c0e311e010fc Lib/idlelib/MultiCall.py +--- a/Lib/idlelib/MultiCall.py ++++ b/Lib/idlelib/MultiCall.py +@@ -420,9 +420,12 @@ + _multicall_dict[widget] = MultiCall + return MultiCall + +-if __name__ == "__main__": +- # Test ++ ++def _multi_call(parent): + root = tkinter.Tk() ++ root.title("Test MultiCall") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) + text = MultiCallCreator(tkinter.Text)(root) + text.pack() + def bindseq(seq, n=[0]): +@@ -438,8 +441,13 @@ + bindseq("") + bindseq("") + bindseq("") ++ bindseq("") + bindseq("") + bindseq("") + bindseq("") + bindseq("") + root.mainloop() ++ ++if __name__ == "__main__": ++ from idlelib.idle_test.htest import run ++ run(_multi_call) +diff -r c0e311e010fc Lib/idlelib/MultiStatusBar.py +--- a/Lib/idlelib/MultiStatusBar.py ++++ b/Lib/idlelib/MultiStatusBar.py +@@ -17,16 +17,29 @@ + label = self.labels[name] + label.config(text=text) + +-def _test(): +- b = Frame() +- c = Text(b) +- c.pack(side=TOP) +- a = MultiStatusBar(b) +- a.set_label("one", "hello") +- a.set_label("two", "world") +- a.pack(side=BOTTOM, fill=X) +- b.pack() +- b.mainloop() ++def _multistatus_bar(parent): ++ root = Tk() ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d" %(x, y + 150)) ++ root.title("Test multistatus bar") ++ frame = Frame(root) ++ text = Text(frame) ++ text.pack() ++ msb = MultiStatusBar(frame) ++ msb.set_label("one", "hello") ++ msb.set_label("two", "world") ++ 
msb.pack(side=BOTTOM, fill=X) ++ ++ def change(): ++ msb.set_label("one", "foo") ++ msb.set_label("two", "bar") ++ ++ button = Button(root, text="Update status", command=change) ++ button.pack(side=BOTTOM) ++ frame.pack() ++ frame.mainloop() ++ root.mainloop() + + if __name__ == '__main__': +- _test() ++ from idlelib.idle_test.htest import run ++ run(_multistatus_bar) +diff -r c0e311e010fc Lib/idlelib/ObjectBrowser.py +--- a/Lib/idlelib/ObjectBrowser.py ++++ b/Lib/idlelib/ObjectBrowser.py +@@ -9,6 +9,8 @@ + # XXX TO DO: + # - for classes/modules, add "open source" to object browser + ++import re ++ + from idlelib.TreeWidget import TreeItem, TreeNode, ScrolledCanvas + + from reprlib import Repr +@@ -119,12 +121,14 @@ + c = ObjectTreeItem + return c(labeltext, object, setfunction) + +-# Test script + +-def _test(): ++def _object_browser(parent): + import sys + from tkinter import Tk + root = Tk() ++ root.title("Test ObjectBrowser") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) + root.configure(bd=0, bg="yellow") + root.focus_set() + sc = ScrolledCanvas(root, bg="white", highlightthickness=0, takefocus=1) +@@ -135,4 +139,5 @@ + root.mainloop() + + if __name__ == '__main__': +- _test() ++ from idlelib.idle_test.htest import run ++ run(_object_browser) +diff -r c0e311e010fc Lib/idlelib/ParenMatch.py +--- a/Lib/idlelib/ParenMatch.py ++++ b/Lib/idlelib/ParenMatch.py +@@ -90,7 +90,8 @@ + self.set_timeout = self.set_timeout_none + + def flash_paren_event(self, event): +- indices = HyperParser(self.editwin, "insert").get_surrounding_brackets() ++ indices = (HyperParser(self.editwin, "insert") ++ .get_surrounding_brackets()) + if indices is None: + self.warn_mismatched() + return +@@ -167,6 +168,11 @@ + # associate a counter with an event; only disable the "paren" + # tag if the event is for the most recent timer. 
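The ParenMatch comment just above (its code continues below) describes a standard Tk idiom: tag each scheduled after() callback with a counter so that only the most recent timer acts. A minimal sketch of that idiom, using a hypothetical class not taken from the patch:

    import tkinter as tk

    class Flasher:
        """Briefly highlight something; a newer flash supersedes older cleanups."""
        FLASH_DELAY = 500   # milliseconds

        def __init__(self, widget):
            self.widget = widget
            self.counter = 0

        def flash(self):
            self.counter += 1
            # Capture the current counter; if another flash happens before the
            # callback fires, the stale callback sees a mismatch and does nothing.
            self.widget.after(self.FLASH_DELAY,
                              lambda c=self.counter: self.restore(c))

        def restore(self, counter):
            if counter == self.counter:    # only the most recent timer acts
                print("restore highlight")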
+ self.counter += 1 +- self.editwin.text_frame.after(self.FLASH_DELAY, +- lambda self=self, c=self.counter: \ +- self.handle_restore_timer(c)) ++ self.editwin.text_frame.after( ++ self.FLASH_DELAY, ++ lambda self=self, c=self.counter: self.handle_restore_timer(c)) ++ ++ ++if __name__ == '__main__': ++ import unittest ++ unittest.main('idlelib.idle_test.test_parenmatch', verbosity=2) +diff -r c0e311e010fc Lib/idlelib/PathBrowser.py +--- a/Lib/idlelib/PathBrowser.py ++++ b/Lib/idlelib/PathBrowser.py +@@ -4,10 +4,16 @@ + + from idlelib.TreeWidget import TreeItem + from idlelib.ClassBrowser import ClassBrowser, ModuleBrowserTreeItem ++from idlelib.PyShell import PyShellFileList ++ + + class PathBrowser(ClassBrowser): + +- def __init__(self, flist): ++ def __init__(self, flist, _htest=False): ++ """ ++ _htest - bool, change box location when running htest ++ """ ++ self._htest = _htest + self.init(flist) + + def settitle(self): +@@ -87,12 +93,14 @@ + sorted.sort() + return sorted + +-def main(): +- from idlelib import PyShell +- PathBrowser(PyShell.flist) +- if sys.stdin is sys.__stdin__: +- mainloop() ++def _path_browser(parent): ++ flist = PyShellFileList(parent) ++ PathBrowser(flist, _htest=True) ++ parent.mainloop() + + if __name__ == "__main__": + from unittest import main + main('idlelib.idle_test.test_pathbrowser', verbosity=2, exit=False) ++ ++ from idlelib.idle_test.htest import run ++ run(_path_browser) +diff -r c0e311e010fc Lib/idlelib/Percolator.py +--- a/Lib/idlelib/Percolator.py ++++ b/Lib/idlelib/Percolator.py +@@ -51,8 +51,9 @@ + f.setdelegate(filter.delegate) + filter.setdelegate(None) + +-def main(): +- import tkinter as Tk ++def _percolator(parent): ++ import tkinter as tk ++ import re + class Tracer(Delegator): + def __init__(self, name): + self.name = name +@@ -63,22 +64,41 @@ + def delete(self, *args): + print(self.name, ": delete", args) + self.delegate.delete(*args) +- root = Tk.Tk() +- root.wm_protocol("WM_DELETE_WINDOW", root.quit) +- text = Tk.Text() +- text.pack() +- text.focus_set() ++ root = tk.Tk() ++ root.title("Test Percolator") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ text = tk.Text(root) + p = Percolator(text) + t1 = Tracer("t1") + t2 = Tracer("t2") +- p.insertfilter(t1) +- p.insertfilter(t2) +- root.mainloop() # click close widget to continue... +- p.removefilter(t2) +- root.mainloop() +- p.insertfilter(t2) +- p.removefilter(t1) ++ ++ def toggle1(): ++ if var1.get() == 0: ++ var1.set(1) ++ p.insertfilter(t1) ++ elif var1.get() == 1: ++ var1.set(0) ++ p.removefilter(t1) ++ ++ def toggle2(): ++ if var2.get() == 0: ++ var2.set(1) ++ p.insertfilter(t2) ++ elif var2.get() == 1: ++ var2.set(0) ++ p.removefilter(t2) ++ ++ text.pack() ++ var1 = tk.IntVar() ++ cb1 = tk.Checkbutton(root, text="Tracer1", command=toggle1, variable=var1) ++ cb1.pack() ++ var2 = tk.IntVar() ++ cb2 = tk.Checkbutton(root, text="Tracer2", command=toggle2, variable=var2) ++ cb2.pack() ++ + root.mainloop() + + if __name__ == "__main__": +- main() ++ from idlelib.idle_test.htest import run ++ run(_percolator) +diff -r c0e311e010fc Lib/idlelib/PyParse.py +--- a/Lib/idlelib/PyParse.py ++++ b/Lib/idlelib/PyParse.py +@@ -1,5 +1,7 @@ + import re + import sys ++from collections import Mapping ++from functools import partial + + # Reason last stmt is continued (or C_NONE if it's not). 
+ (C_NONE, C_BACKSLASH, C_STRING_FIRST_LINE, +@@ -91,19 +93,48 @@ + [^[\](){}#'"\\]+ + """, re.VERBOSE).match + +-# Build translation table to map uninteresting chars to "x", open +-# brackets to "(", and close brackets to ")". + +-_tran = {} +-for i in range(256): +- _tran[i] = 'x' +-for ch in "({[": +- _tran[ord(ch)] = '(' +-for ch in ")}]": +- _tran[ord(ch)] = ')' +-for ch in "\"'\\\n#": +- _tran[ord(ch)] = ch +-del i, ch ++class StringTranslatePseudoMapping(Mapping): ++ r"""Utility class to be used with str.translate() ++ ++ This Mapping class wraps a given dict. When a value for a key is ++ requested via __getitem__() or get(), the key is looked up in the ++ given dict. If found there, the value from the dict is returned. ++ Otherwise, the default value given upon initialization is returned. ++ ++ This allows using str.translate() to make some replacements, and to ++ replace all characters for which no replacement was specified with ++ a given character instead of leaving them as-is. ++ ++ For example, to replace everything except whitespace with 'x': ++ ++ >>> whitespace_chars = ' \t\n\r' ++ >>> preserve_dict = {ord(c): ord(c) for c in whitespace_chars} ++ >>> mapping = StringTranslatePseudoMapping(preserve_dict, ord('x')) ++ >>> text = "a + b\tc\nd" ++ >>> text.translate(mapping) ++ 'x x x\tx\nx' ++ """ ++ def __init__(self, non_defaults, default_value): ++ self._non_defaults = non_defaults ++ self._default_value = default_value ++ ++ def _get(key, _get=non_defaults.get, _default=default_value): ++ return _get(key, _default) ++ self._get = _get ++ ++ def __getitem__(self, item): ++ return self._get(item) ++ ++ def __len__(self): ++ return len(self._non_defaults) ++ ++ def __iter__(self): ++ return iter(self._non_defaults) ++ ++ def get(self, key, default=None): ++ return self._get(key) ++ + + class Parser: + +@@ -113,19 +144,6 @@ + + def set_str(self, s): + assert len(s) == 0 or s[-1] == '\n' +- if isinstance(s, str): +- # The parse functions have no idea what to do with Unicode, so +- # replace all Unicode characters with "x". This is "safe" +- # so long as the only characters germane to parsing the structure +- # of Python are 7-bit ASCII. It's *necessary* because Unicode +- # strings don't have a .translate() method that supports +- # deletechars. +- uniphooey = s +- s = [] +- push = s.append +- for raw in map(ord, uniphooey): +- push(raw < 127 and chr(raw) or "x") +- s = "".join(s) + self.str = s + self.study_level = 0 + +@@ -197,6 +215,16 @@ + if lo > 0: + self.str = self.str[lo:] + ++ # Build a translation table to map uninteresting chars to 'x', open ++ # brackets to '(', close brackets to ')' while preserving quotes, ++ # backslashes, newlines and hashes. This is to be passed to ++ # str.translate() in _study1(). ++ _tran = {} ++ _tran.update((ord(c), ord('(')) for c in "({[") ++ _tran.update((ord(c), ord(')')) for c in ")}]") ++ _tran.update((ord(c), ord(c)) for c in "\"'\\\n#") ++ _tran = StringTranslatePseudoMapping(_tran, default_value=ord('x')) ++ + # As quickly as humanly possible , find the line numbers (0- + # based) of the non-continuation lines. + # Creates self.{goodlines, continuation}. +@@ -211,7 +239,7 @@ + # uninteresting characters. This can cut the number of chars + # by a factor of 10-40, and so greatly speed the following loop. 
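The translation described in the comment just above (the _study1() code continues below) can be reproduced standalone. A small sketch, using collections.defaultdict in place of the StringTranslatePseudoMapping helper this patch adds; for str.translate() the effect is the same, since missing ordinals fall back to the default:

    from collections import defaultdict

    # Uninteresting characters become 'x'; brackets are normalized to '(' / ')';
    # quotes, backslash, newline and '#' survive unchanged.
    table = defaultdict(lambda: 'x')
    table.update({ord(c): '(' for c in "({["})
    table.update({ord(c): ')' for c in ")}]"})
    table.update({ord(c): c for c in "\"'\\\n#"})

    code = 'd = {"key": [1, 2]}  # note\n'
    print(code.translate(table))   # only the bracket/quote/comment structure survives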
+ str = self.str +- str = str.translate(_tran) ++ str = str.translate(self._tran) + str = str.replace('xxxxxxxx', 'x') + str = str.replace('xxxx', 'x') + str = str.replace('xx', 'x') +diff -r c0e311e010fc Lib/idlelib/ReplaceDialog.py +--- a/Lib/idlelib/ReplaceDialog.py ++++ b/Lib/idlelib/ReplaceDialog.py +@@ -40,7 +40,7 @@ + + def create_entries(self): + SearchDialogBase.create_entries(self) +- self.replent = self.make_entry("Replace with:", self.replvar) ++ self.replent = self.make_entry("Replace with:", self.replvar)[0] + + def create_command_buttons(self): + SearchDialogBase.create_command_buttons(self) +@@ -188,3 +188,34 @@ + def close(self, event=None): + SearchDialogBase.close(self, event) + self.text.tag_remove("hit", "1.0", "end") ++ ++def _replace_dialog(parent): ++ root = Tk() ++ root.title("Test ReplaceDialog") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ ++ # mock undo delegator methods ++ def undo_block_start(): ++ pass ++ ++ def undo_block_stop(): ++ pass ++ ++ text = Text(root) ++ text.undo_block_start = undo_block_start ++ text.undo_block_stop = undo_block_stop ++ text.pack() ++ text.insert("insert","This is a sample string.\n"*10) ++ ++ def show_replace(): ++ text.tag_add(SEL, "1.0", END) ++ replace(text) ++ text.tag_remove(SEL, "1.0", END) ++ ++ button = Button(root, text="Replace", command=show_replace) ++ button.pack() ++ ++if __name__ == '__main__': ++ from idlelib.idle_test.htest import run ++ run(_replace_dialog) +diff -r c0e311e010fc Lib/idlelib/ScrolledList.py +--- a/Lib/idlelib/ScrolledList.py ++++ b/Lib/idlelib/ScrolledList.py +@@ -119,21 +119,22 @@ + pass + + +-def test(): ++def _scrolled_list(parent): + root = Tk() +- root.protocol("WM_DELETE_WINDOW", root.destroy) ++ root.title("Test ScrolledList") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) + class MyScrolledList(ScrolledList): +- def fill_menu(self): self.menu.add_command(label="pass") ++ def fill_menu(self): self.menu.add_command(label="right click") + def on_select(self, index): print("select", self.get(index)) + def on_double(self, index): print("double", self.get(index)) +- s = MyScrolledList(root) ++ ++ scrolled_list = MyScrolledList(root) + for i in range(30): +- s.append("item %02d" % i) +- return root ++ scrolled_list.append("Item %02d" % i) + +-def main(): +- root = test() + root.mainloop() + + if __name__ == '__main__': +- main() ++ from idlelib.idle_test.htest import run ++ run(_scrolled_list) +diff -r c0e311e010fc Lib/idlelib/SearchDialog.py +--- a/Lib/idlelib/SearchDialog.py ++++ b/Lib/idlelib/SearchDialog.py +@@ -65,3 +65,25 @@ + if pat: + self.engine.setcookedpat(pat) + return self.find_again(text) ++ ++def _search_dialog(parent): ++ root = Tk() ++ root.title("Test SearchDialog") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ text = Text(root) ++ text.pack() ++ text.insert("insert","This is a sample string.\n"*10) ++ ++ def show_find(): ++ text.tag_add(SEL, "1.0", END) ++ s = _setup(text) ++ s.open(text) ++ text.tag_remove(SEL, "1.0", END) ++ ++ button = Button(root, text="Search", command=show_find) ++ button.pack() ++ ++if __name__ == '__main__': ++ from idlelib.idle_test.htest import run ++ run(_search_dialog) +diff -r c0e311e010fc Lib/idlelib/SearchDialogBase.py +--- a/Lib/idlelib/SearchDialogBase.py ++++ b/Lib/idlelib/SearchDialogBase.py +@@ -1,34 +1,51 @@ + '''Define 
SearchDialogBase used by Search, Replace, and Grep dialogs.''' +-from tkinter import * ++ ++from tkinter import (Toplevel, Frame, Entry, Label, Button, ++ Checkbutton, Radiobutton) + + class SearchDialogBase: +- '''Create most of a modal search dialog (make_frame, create_widgets). ++ '''Create most of a 3 or 4 row, 3 column search dialog. + +- The wide left column contains: +- 1 or 2 text entry lines (create_entries, make_entry); +- a row of standard radiobuttons (create_option_buttons); +- a row of dialog specific radiobuttons (create_other_buttons). ++ The left and wide middle column contain: ++ 1 or 2 labeled text entry lines (make_entry, create_entries); ++ a row of standard Checkbuttons (make_frame, create_option_buttons), ++ each of which corresponds to a search engine Variable; ++ a row of dialog-specific Check/Radiobuttons (create_other_buttons). + + The narrow right column contains command buttons +- (create_command_buttons, make_button). ++ (make_button, create_command_buttons). + These are bound to functions that execute the command. + +- Except for command buttons, this base class is not limited to +- items common to all three subclasses. Rather, it is the Find dialog +- minus the "Find Next" command and its execution function. +- The other dialogs override methods to replace and add widgets. ++ Except for command buttons, this base class is not limited to items ++ common to all three subclasses. Rather, it is the Find dialog minus ++ the "Find Next" command, its execution function, and the ++ default_command attribute needed in create_widgets. The other ++ dialogs override attributes and methods, the latter to replace and ++ add widgets. + ''' + +- title = "Search Dialog" ++ title = "Search Dialog" # replace in subclasses + icon = "Search" +- needwrapbutton = 1 ++ needwrapbutton = 1 # not in Find in Files + + def __init__(self, root, engine): ++ '''Initialize root, engine, and top attributes. ++ ++ top (level widget): set in create_widgets() called from open(). ++ text (Text searched): set in open(), only used in subclasses(). ++ ent (ry): created in make_entry() called from create_entry(). ++ row (of grid): 0 in create_widgets(), +1 in make_entry/frame(). ++ default_command: set in subclasses, used in create_widgers(). ++ ++ title (of dialog): class attribute, override in subclasses. ++ icon (of dialog): ditto, use unclear if cannot minimize dialog. ++ ''' + self.root = root + self.engine = engine + self.top = None + + def open(self, text, searchphrase=None): ++ "Make dialog visible on top of others and ready to use." + self.text = text + if not self.top: + self.create_widgets() +@@ -44,11 +61,17 @@ + self.top.grab_set() + + def close(self, event=None): ++ "Put dialog away for later use." + if self.top: + self.top.grab_release() + self.top.withdraw() + + def create_widgets(self): ++ '''Create basic 3 row x 3 col search (find) dialog. ++ ++ Other dialogs override subsidiary create_x methods as needed. ++ Replace and Find-in-Files add another entry row. 
++ ''' + top = Toplevel(self.root) + top.bind("", self.default_command) + top.bind("", self.close) +@@ -61,29 +84,84 @@ + self.top.grid_columnconfigure(0, pad=2, weight=0) + self.top.grid_columnconfigure(1, pad=2, minsize=100, weight=100) + +- self.create_entries() +- self.create_option_buttons() +- self.create_other_buttons() +- return self.create_command_buttons() ++ self.create_entries() # row 0 (and maybe 1), cols 0, 1 ++ self.create_option_buttons() # next row, cols 0, 1 ++ self.create_other_buttons() # next row, cols 0, 1 ++ self.create_command_buttons() # col 2, all rows + +- def make_entry(self, label, var): +- l = Label(self.top, text=label) +- l.grid(row=self.row, column=0, sticky="nw") +- e = Entry(self.top, textvariable=var, exportselection=0) +- e.grid(row=self.row, column=1, sticky="nwe") ++ def make_entry(self, label_text, var): ++ '''Return (entry, label), . ++ ++ entry - gridded labeled Entry for text entry. ++ label - Label widget, returned for testing. ++ ''' ++ label = Label(self.top, text=label_text) ++ label.grid(row=self.row, column=0, sticky="nw") ++ entry = Entry(self.top, textvariable=var, exportselection=0) ++ entry.grid(row=self.row, column=1, sticky="nwe") + self.row = self.row + 1 +- return e ++ return entry, label ++ ++ def create_entries(self): ++ "Create one or more entry lines with make_entry." ++ self.ent = self.make_entry("Find:", self.engine.patvar)[0] + + def make_frame(self,labeltext=None): ++ '''Return (frame, label). ++ ++ frame - gridded labeled Frame for option or other buttons. ++ label - Label widget, returned for testing. ++ ''' + if labeltext: +- l = Label(self.top, text=labeltext) +- l.grid(row=self.row, column=0, sticky="nw") +- f = Frame(self.top) +- f.grid(row=self.row, column=1, columnspan=1, sticky="nwe") ++ label = Label(self.top, text=labeltext) ++ label.grid(row=self.row, column=0, sticky="nw") ++ else: ++ label = '' ++ frame = Frame(self.top) ++ frame.grid(row=self.row, column=1, columnspan=1, sticky="nwe") + self.row = self.row + 1 +- return f ++ return frame, label ++ ++ def create_option_buttons(self): ++ '''Return (filled frame, options) for testing. ++ ++ Options is a list of SearchEngine booleanvar, label pairs. ++ A gridded frame from make_frame is filled with a Checkbutton ++ for each pair, bound to the var, with the corresponding label. ++ ''' ++ frame = self.make_frame("Options")[0] ++ engine = self.engine ++ options = [(engine.revar, "Regular expression"), ++ (engine.casevar, "Match case"), ++ (engine.wordvar, "Whole word")] ++ if self.needwrapbutton: ++ options.append((engine.wrapvar, "Wrap around")) ++ for var, label in options: ++ btn = Checkbutton(frame, anchor="w", variable=var, text=label) ++ btn.pack(side="left", fill="both") ++ if var.get(): ++ btn.select() ++ return frame, options ++ ++ def create_other_buttons(self): ++ '''Return (frame, others) for testing. ++ ++ Others is a list of value, label pairs. ++ A gridded frame from make_frame is filled with radio buttons. ++ ''' ++ frame = self.make_frame("Direction")[0] ++ var = self.engine.backvar ++ others = [(1, 'Up'), (0, 'Down')] ++ for val, label in others: ++ btn = Radiobutton(frame, anchor="w", ++ variable=var, value=val, text=label) ++ btn.pack(side="left", fill="both") ++ if var.get() == val: ++ btn.select() ++ return frame, others + + def make_button(self, label, command, isdef=0): ++ "Return command button gridded in command frame." 
+ b = Button(self.buttonframe, + text=label, command=command, + default=isdef and "active" or "normal") +@@ -92,66 +170,15 @@ + self.buttonframe.grid(rowspan=rows+1) + return b + +- def create_entries(self): +- self.ent = self.make_entry("Find:", self.engine.patvar) +- +- def create_option_buttons(self): +- f = self.make_frame("Options") +- +- btn = Checkbutton(f, anchor="w", +- variable=self.engine.revar, +- text="Regular expression") +- btn.pack(side="left", fill="both") +- if self.engine.isre(): +- btn.select() +- +- btn = Checkbutton(f, anchor="w", +- variable=self.engine.casevar, +- text="Match case") +- btn.pack(side="left", fill="both") +- if self.engine.iscase(): +- btn.select() +- +- btn = Checkbutton(f, anchor="w", +- variable=self.engine.wordvar, +- text="Whole word") +- btn.pack(side="left", fill="both") +- if self.engine.isword(): +- btn.select() +- +- if self.needwrapbutton: +- btn = Checkbutton(f, anchor="w", +- variable=self.engine.wrapvar, +- text="Wrap around") +- btn.pack(side="left", fill="both") +- if self.engine.iswrap(): +- btn.select() +- +- def create_other_buttons(self): +- f = self.make_frame("Direction") +- +- #lbl = Label(f, text="Direction: ") +- #lbl.pack(side="left") +- +- btn = Radiobutton(f, anchor="w", +- variable=self.engine.backvar, value=1, +- text="Up") +- btn.pack(side="left", fill="both") +- if self.engine.isback(): +- btn.select() +- +- btn = Radiobutton(f, anchor="w", +- variable=self.engine.backvar, value=0, +- text="Down") +- btn.pack(side="left", fill="both") +- if not self.engine.isback(): +- btn.select() +- + def create_command_buttons(self): +- # +- # place button frame on the right ++ "Place buttons in vertical command frame gridded on right." + f = self.buttonframe = Frame(self.top) + f.grid(row=0,column=2,padx=2,pady=2,ipadx=2,ipady=2) + + b = self.make_button("close", self.close) + b.lower() ++ ++if __name__ == '__main__': ++ import unittest ++ unittest.main( ++ 'idlelib.idle_test.test_searchdialogbase', verbosity=2) +diff -r c0e311e010fc Lib/idlelib/SearchEngine.py +--- a/Lib/idlelib/SearchEngine.py ++++ b/Lib/idlelib/SearchEngine.py +@@ -85,7 +85,7 @@ + except re.error as what: + args = what.args + msg = args[0] +- col = arg[1] if len(args) >= 2 else -1 ++ col = args[1] if len(args) >= 2 else -1 + self.report_error(pat, msg, col) + return None + return prog +@@ -229,6 +229,5 @@ + return line, col + + if __name__ == "__main__": +- from test import support; support.use_resources = ['gui'] + import unittest + unittest.main('idlelib.idle_test.test_searchengine', verbosity=2, exit=False) +diff -r c0e311e010fc Lib/idlelib/StackViewer.py +--- a/Lib/idlelib/StackViewer.py ++++ b/Lib/idlelib/StackViewer.py +@@ -1,9 +1,12 @@ + import os + import sys + import linecache ++import re ++import tkinter as tk + + from idlelib.TreeWidget import TreeNode, TreeItem, ScrolledCanvas + from idlelib.ObjectBrowser import ObjectTreeItem, make_objecttreeitem ++from idlelib.PyShell import PyShellFileList + + def StackBrowser(root, flist=None, tb=None, top=None): + if top is None: +@@ -120,3 +123,30 @@ + item = make_objecttreeitem(key + " =", value, setfunction) + sublist.append(item) + return sublist ++ ++def _stack_viewer(parent): ++ root = tk.Tk() ++ root.title("Test StackViewer") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ flist = PyShellFileList(root) ++ try: # to obtain a traceback object ++ intentional_name_error ++ except NameError: ++ exc_type, exc_value, exc_tb = sys.exc_info() 
++ ++ # inject stack trace to sys ++ sys.last_type = exc_type ++ sys.last_value = exc_value ++ sys.last_traceback = exc_tb ++ ++ StackBrowser(root, flist=flist, top=root, tb=exc_tb) ++ ++ # restore sys to original state ++ del sys.last_type ++ del sys.last_value ++ del sys.last_traceback ++ ++if __name__ == '__main__': ++ from idlelib.idle_test.htest import run ++ run(_stack_viewer) +diff -r c0e311e010fc Lib/idlelib/ToolTip.py +--- a/Lib/idlelib/ToolTip.py ++++ b/Lib/idlelib/ToolTip.py +@@ -76,14 +76,22 @@ + for item in self.items: + listbox.insert(END, item) + +-def main(): +- # Test code ++def _tooltip(parent): + root = Tk() +- b = Button(root, text="Hello", command=root.destroy) +- b.pack() +- root.update() +- tip = ListboxToolTip(b, ["Hello", "world"]) ++ root.title("Test tooltip") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ label = Label(root, text="Place your mouse over buttons") ++ label.pack() ++ button1 = Button(root, text="Button 1") ++ button2 = Button(root, text="Button 2") ++ button1.pack() ++ button2.pack() ++ ToolTip(button1, "This is tooltip text for button1.") ++ ListboxToolTip(button2, ["This is","multiple line", ++ "tooltip text","for button2"]) + root.mainloop() + + if __name__ == '__main__': +- main() ++ from idlelib.idle_test.htest import run ++ run(_tooltip) +diff -r c0e311e010fc Lib/idlelib/TreeWidget.py +--- a/Lib/idlelib/TreeWidget.py ++++ b/Lib/idlelib/TreeWidget.py +@@ -448,29 +448,18 @@ + return "break" + + +-# Testing functions +- +-def test(): +- from idlelib import PyShell +- root = Toplevel(PyShell.root) +- root.configure(bd=0, bg="yellow") +- root.focus_set() ++def _tree_widget(parent): ++ root = Tk() ++ root.title("Test TreeWidget") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) + sc = ScrolledCanvas(root, bg="white", highlightthickness=0, takefocus=1) +- sc.frame.pack(expand=1, fill="both") +- item = FileTreeItem("C:/windows/desktop") ++ sc.frame.pack(expand=1, fill="both", side=LEFT) ++ item = FileTreeItem(os.getcwd()) + node = TreeNode(sc.canvas, None, item) + node.expand() +- +-def test2(): +- # test w/o scrolling canvas +- root = Tk() +- root.configure(bd=0) +- canvas = Canvas(root, bg="white", highlightthickness=0) +- canvas.pack(expand=1, fill="both") +- item = FileTreeItem(os.curdir) +- node = TreeNode(canvas, None, item) +- node.update() +- canvas.focus_set() ++ root.mainloop() + + if __name__ == '__main__': +- test() ++ from idlelib.idle_test.htest import run ++ run(_tree_widget) +diff -r c0e311e010fc Lib/idlelib/UndoDelegator.py +--- a/Lib/idlelib/UndoDelegator.py ++++ b/Lib/idlelib/UndoDelegator.py +@@ -336,17 +336,30 @@ + self.depth = self.depth + incr + return self.depth + +-def main(): ++def _undo_delegator(parent): + from idlelib.Percolator import Percolator + root = Tk() +- root.wm_protocol("WM_DELETE_WINDOW", root.quit) +- text = Text() ++ root.title("Test UndoDelegator") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ ++ text = Text(root) ++ text.config(height=10) + text.pack() + text.focus_set() + p = Percolator(text) + d = UndoDelegator() + p.insertfilter(d) ++ ++ undo = Button(root, text="Undo", command=lambda:d.undo_event(None)) ++ undo.pack(side='left') ++ redo = Button(root, text="Redo", command=lambda:d.redo_event(None)) ++ redo.pack(side='left') ++ dump = Button(root, text="Dump", command=lambda:d.dump_event(None)) 
++ dump.pack(side='left') ++ + root.mainloop() + + if __name__ == "__main__": +- main() ++ from idlelib.idle_test.htest import run ++ run(_undo_delegator) +diff -r c0e311e010fc Lib/idlelib/WidgetRedirector.py +--- a/Lib/idlelib/WidgetRedirector.py ++++ b/Lib/idlelib/WidgetRedirector.py +@@ -1,29 +1,40 @@ +-from tkinter import * ++from tkinter import TclError + + class WidgetRedirector: +- + """Support for redirecting arbitrary widget subcommands. + +- Some Tk operations don't normally pass through Tkinter. For example, if a ++ Some Tk operations don't normally pass through tkinter. For example, if a + character is inserted into a Text widget by pressing a key, a default Tk + binding to the widget's 'insert' operation is activated, and the Tk library +- processes the insert without calling back into Tkinter. ++ processes the insert without calling back into tkinter. + +- Although a binding to could be made via Tkinter, what we really want +- to do is to hook the Tk 'insert' operation itself. ++ Although a binding to could be made via tkinter, what we really want ++ to do is to hook the Tk 'insert' operation itself. For one thing, we want ++ a text.insert call in idle code to have the same effect as a key press. + + When a widget is instantiated, a Tcl command is created whose name is the + same as the pathname widget._w. This command is used to invoke the various + widget operations, e.g. insert (for a Text widget). We are going to hook + this command and provide a facility ('register') to intercept the widget +- operation. ++ operation. We will also intercept method calls on the tkinter class ++ instance that represents the tk widget. + +- In IDLE, the function being registered provides access to the top of a +- Percolator chain. At the bottom of the chain is a call to the original +- Tk widget operation. +- ++ In IDLE, WidgetRedirector is used in Percolator to intercept Text ++ commands. The function being registered provides access to the top ++ of a Percolator chain. At the bottom of the chain is a call to the ++ original Tk widget operation. + """ + def __init__(self, widget): ++ '''Initialize attributes and setup redirection. ++ ++ _operations: dict mapping operation name to new function. ++ widget: the widget whose tcl command is to be intercepted. ++ tk: widget.tk, a convenience attribute, probably not needed. ++ orig: new name of the original tcl command. ++ ++ Since renaming to orig fails with TclError when orig already ++ exists, only one WidgetDirector can exist for a given widget. ++ ''' + self._operations = {} + self.widget = widget # widget instance + self.tk = tk = widget.tk # widget's root +@@ -40,27 +51,45 @@ + self.widget._w) + + def close(self): ++ "Unregister operations and revert redirection created by .__init__." + for operation in list(self._operations): + self.unregister(operation) +- widget = self.widget; del self.widget +- orig = self.orig; del self.orig ++ widget = self.widget + tk = widget.tk + w = widget._w ++ # Restore the original widget Tcl command. + tk.deletecommand(w) +- # restore the original widget Tcl command: +- tk.call("rename", orig, w) ++ tk.call("rename", self.orig, w) ++ del self.widget, self.tk # Should not be needed ++ # if instance is deleted after close, as in Percolator. + + def register(self, operation, function): ++ '''Return OriginalCommand(operation) after registering function. ++ ++ Registration adds an operation: function pair to ._operations. ++ It also adds an widget function attribute that masks the tkinter ++ class instance method. 
Method masking operates independently ++ from command dispatch. ++ ++ If a second function is registered for the same operation, the ++ first function is replaced in both places. ++ ''' + self._operations[operation] = function + setattr(self.widget, operation, function) + return OriginalCommand(self, operation) + + def unregister(self, operation): ++ '''Return the function for the operation, or None. ++ ++ Deleting the instance attribute unmasks the class attribute. ++ ''' + if operation in self._operations: + function = self._operations[operation] + del self._operations[operation] +- if hasattr(self.widget, operation): ++ try: + delattr(self.widget, operation) ++ except AttributeError: ++ pass + return function + else: + return None +@@ -88,14 +117,29 @@ + + + class OriginalCommand: ++ '''Callable for original tk command that has been redirected. ++ ++ Returned by .register; can be used in the function registered. ++ redir = WidgetRedirector(text) ++ def my_insert(*args): ++ print("insert", args) ++ original_insert(*args) ++ original_insert = redir.register("insert", my_insert) ++ ''' + + def __init__(self, redir, operation): ++ '''Create .tk_call and .orig_and_operation for .__call__ method. ++ ++ .redir and .operation store the input args for __repr__. ++ .tk and .orig copy attributes of .redir (probably not needed). ++ ''' + self.redir = redir + self.operation = operation +- self.tk = redir.tk +- self.orig = redir.orig +- self.tk_call = self.tk.call +- self.orig_and_operation = (self.orig, self.operation) ++ self.tk = redir.tk # redundant with self.redir ++ self.orig = redir.orig # redundant with self.redir ++ # These two could be deleted after checking recipient code. ++ self.tk_call = redir.tk.call ++ self.orig_and_operation = (redir.orig, operation) + + def __repr__(self): + return "OriginalCommand(%r, %r)" % (self.redir, self.operation) +@@ -104,23 +148,27 @@ + return self.tk_call(self.orig_and_operation + args) + + +-def main(): ++def _widget_redirector(parent): # htest # ++ from tkinter import Tk, Text ++ import re ++ + root = Tk() +- root.wm_protocol("WM_DELETE_WINDOW", root.quit) +- text = Text() ++ root.title("Test WidgetRedirector") ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ text = Text(root) + text.pack() + text.focus_set() + redir = WidgetRedirector(text) +- global previous_tcl_fcn + def my_insert(*args): + print("insert", args) +- previous_tcl_fcn(*args) +- previous_tcl_fcn = redir.register("insert", my_insert) ++ original_insert(*args) ++ original_insert = redir.register("insert", my_insert) + root.mainloop() +- redir.unregister("insert") # runs after first 'close window' +- redir.close() +- root.mainloop() +- root.destroy() + + if __name__ == "__main__": +- main() ++ import unittest ++ unittest.main('idlelib.idle_test.test_widgetredir', ++ verbosity=2, exit=False) ++ from idlelib.idle_test.htest import run ++ run(_widget_redirector) +diff -r c0e311e010fc Lib/idlelib/aboutDialog.py +--- a/Lib/idlelib/aboutDialog.py ++++ b/Lib/idlelib/aboutDialog.py +@@ -12,11 +12,16 @@ + """Modal about dialog for idle + + """ +- def __init__(self, parent, title): ++ def __init__(self, parent, title, _htest=False): ++ """ ++ _htest - bool, change box location when running htest ++ """ + Toplevel.__init__(self, parent) + self.configure(borderwidth=5) +- self.geometry("+%d+%d" % (parent.winfo_rootx()+30, +- parent.winfo_rooty()+30)) ++ # place dialog below parent if running htest ++ self.geometry("+%d+%d" % ( ++ 
parent.winfo_rootx()+30, ++ parent.winfo_rooty()+(30 if not _htest else 100))) + self.bg = "#707070" + self.fg = "#ffffff" + self.CreateWidgets() +diff -r c0e311e010fc Lib/idlelib/config-keys.def +--- a/Lib/idlelib/config-keys.def ++++ b/Lib/idlelib/config-keys.def +@@ -13,37 +13,37 @@ + paste= + beginning-of-line= + center-insert= +-close-all-windows= ++close-all-windows= + close-window= + do-nothing= + end-of-file= + python-docs= + python-context-help= +-history-next= +-history-previous= ++history-next= ++history-previous= + interrupt-execution= + view-restart= + restart-shell= +-open-class-browser= +-open-module= ++open-class-browser= ++open-module= + open-new-window= + open-window-from-file= + plain-newline-and-indent= + print-window= +-redo= ++redo= + remove-selection= +-save-copy-of-window-as-file= +-save-window-as-file= +-save-window= +-select-all= ++save-copy-of-window-as-file= ++save-window-as-file= ++save-window= ++select-all= + toggle-auto-coloring= + undo= + find= +-find-again= ++find-again= + find-in-files= + find-selection= + replace= +-goto-line= ++goto-line= + smart-backspace= + newline-and-indent= + smart-indent= +@@ -53,8 +53,8 @@ + uncomment-region= + tabify-region= + untabify-region= +-toggle-tabs= +-change-indentwidth= ++toggle-tabs= ++change-indentwidth= + del-word-left= + del-word-right= + +diff -r c0e311e010fc Lib/idlelib/configDialog.py +--- a/Lib/idlelib/configDialog.py ++++ b/Lib/idlelib/configDialog.py +@@ -13,7 +13,6 @@ + import tkinter.messagebox as tkMessageBox + import tkinter.colorchooser as tkColorChooser + import tkinter.font as tkFont +-import copy + + from idlelib.configHandler import idleConf + from idlelib.dynOptionMenuWidget import DynOptionMenu +@@ -25,14 +24,20 @@ + + class ConfigDialog(Toplevel): + +- def __init__(self,parent,title): ++ def __init__(self, parent, title, _htest=False, _utest=False): ++ """ ++ _htest - bool, change box location when running htest ++ _utest - bool, don't wait_window when running unittest ++ """ + Toplevel.__init__(self, parent) + self.wm_withdraw() + + self.configure(borderwidth=5) + self.title('IDLE Preferences') ++ if _htest: ++ parent.instance_dict = {} + self.geometry("+%d+%d" % (parent.winfo_rootx()+20, +- parent.winfo_rooty()+30)) ++ parent.winfo_rooty()+(30 if not _htest else 150))) + #Theme Elements. Each theme element key is its display name. + #The first value of the tuple is the sample area tag name. + #The second value is the display name list sort index. 
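The two comments just above describe the shape of the themeElements mapping used by the configuration dialog. A purely illustrative example of that shape (the keys and values here are examples only, not the complete or authoritative table):

    # display name -> (sample-area tag name, sort index for the display list)
    themeElements = {
        'Normal Text': ('normal', '00'),
        'Python Keywords': ('keyword', '01'),
        'Python Comments': ('comment', '03'),
    }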
+@@ -65,8 +70,9 @@ + self.LoadConfigs() + self.AttachVarCallbacks() #avoid callbacks during LoadConfigs + +- self.wm_deiconify() +- self.wait_window() ++ if not _utest: ++ self.wm_deiconify() ++ self.wait_window() + + def CreateWidgets(self): + self.tabPages = TabbedPageSet(self, +@@ -1140,9 +1146,9 @@ + pass + + if __name__ == '__main__': +- #test the dialog +- root=Tk() +- Button(root,text='Dialog', +- command=lambda:ConfigDialog(root,'Settings')).pack() +- root.instance_dict={} +- root.mainloop() ++ import unittest ++ unittest.main('idlelib.idle_test.test_configdialog', ++ verbosity=2, exit=False) ++ ++ from idlelib.idle_test.htest import run ++ run(ConfigDialog) +diff -r c0e311e010fc Lib/idlelib/configHandler.py +--- a/Lib/idlelib/configHandler.py ++++ b/Lib/idlelib/configHandler.py +@@ -20,7 +20,7 @@ + import os + import sys + +-from configparser import ConfigParser, NoOptionError, NoSectionError ++from configparser import ConfigParser + + class InvalidConfigType(Exception): pass + class InvalidConfigSet(Exception): pass +diff -r c0e311e010fc Lib/idlelib/configHelpSourceEdit.py +--- a/Lib/idlelib/configHelpSourceEdit.py ++++ b/Lib/idlelib/configHelpSourceEdit.py +@@ -8,13 +8,14 @@ + import tkinter.filedialog as tkFileDialog + + class GetHelpSourceDialog(Toplevel): +- def __init__(self, parent, title, menuItem='', filePath=''): ++ def __init__(self, parent, title, menuItem='', filePath='', _htest=False): + """Get menu entry and url/ local file location for Additional Help + + User selects a name for the Help resource and provides a web url + or a local file as its source. The user can enter a url or browse + for the file. + ++ _htest - bool, change box location when running htest + """ + Toplevel.__init__(self, parent) + self.configure(borderwidth=5) +@@ -31,12 +32,14 @@ + self.withdraw() #hide while setting geometry + #needs to be done here so that the winfo_reqwidth is valid + self.update_idletasks() +- #centre dialog over parent: +- self.geometry("+%d+%d" % +- ((parent.winfo_rootx() + ((parent.winfo_width()/2) +- -(self.winfo_reqwidth()/2)), +- parent.winfo_rooty() + ((parent.winfo_height()/2) +- -(self.winfo_reqheight()/2))))) ++ #centre dialog over parent. below parent if running htest. 
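The geometry call that follows the centring comment above packs the placement arithmetic into a single expression. Spelled out as a sketch, with hypothetical names (dialog, parent, _htest), this is what it computes:

    def place_dialog(dialog, parent, _htest=False):
        """Center dialog over parent; when running htest, drop it 150 px down instead."""
        x = parent.winfo_rootx() + (parent.winfo_width() - dialog.winfo_reqwidth()) // 2
        if _htest:
            y = parent.winfo_rooty() + 150
        else:
            y = parent.winfo_rooty() + (parent.winfo_height() - dialog.winfo_reqheight()) // 2
        dialog.geometry("+%d+%d" % (x, y))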
++ self.geometry( ++ "+%d+%d" % ( ++ parent.winfo_rootx() + ++ (parent.winfo_width()/2 - self.winfo_reqwidth()/2), ++ parent.winfo_rooty() + ++ ((parent.winfo_height()/2 - self.winfo_reqheight()/2) ++ if not _htest else 150))) + self.deiconify() #geometry set, unhide + self.bind('', self.Ok) + self.wait_window() +@@ -159,11 +162,5 @@ + self.destroy() + + if __name__ == '__main__': +- #test the dialog +- root = Tk() +- def run(): +- keySeq = '' +- dlg = GetHelpSourceDialog(root, 'Get Help Source') +- print(dlg.result) +- Button(root,text='Dialog', command=run).pack() +- root.mainloop() ++ from idlelib.idle_test.htest import run ++ run(GetHelpSourceDialog) +diff -r c0e311e010fc Lib/idlelib/dynOptionMenuWidget.py +--- a/Lib/idlelib/dynOptionMenuWidget.py ++++ b/Lib/idlelib/dynOptionMenuWidget.py +@@ -2,9 +2,10 @@ + OptionMenu widget modified to allow dynamic menu reconfiguration + and setting of highlightthickness + """ +-from tkinter import OptionMenu +-from tkinter import _setit ++from tkinter import OptionMenu, _setit, Tk, StringVar, Button ++ + import copy ++import re + + class DynOptionMenu(OptionMenu): + """ +@@ -33,3 +34,24 @@ + command=_setit(self.variable,item,self.command)) + if value: + self.variable.set(value) ++ ++def _dyn_option_menu(parent): ++ root = Tk() ++ root.title("Tets dynamic option menu") ++ var = StringVar(root) ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 150)) ++ var.set("Old option set") #Set the default value ++ dyn = DynOptionMenu(root,var, "old1","old2","old3","old4") ++ dyn.pack() ++ ++ def update(): ++ dyn.SetMenu(["new1","new2","new3","new4"],value="new option set") ++ ++ button = Button(root, text="Change option set", command=update) ++ button.pack() ++ root.mainloop() ++ ++if __name__ == '__main__': ++ from idlelib.idle_test.htest import run ++ run(_dyn_option_menu) +diff -r c0e311e010fc Lib/idlelib/idle_test/README.txt +--- a/Lib/idlelib/idle_test/README.txt ++++ b/Lib/idlelib/idle_test/README.txt +@@ -26,7 +26,6 @@ + with xyz (lowercased) added after 'test_'. + --- + if __name__ == "__main__": +- from test import support; support.use_resources = ['gui'] + import unittest + unittest.main('idlelib.idle_test.test_', verbosity=2, exit=False) + --- +@@ -34,12 +33,12 @@ + + 2. Gui Tests + +-Gui tests need 'requires' and 'use_resources' from test.support +-(test.test_support in 2.7). A test is a gui test if it creates a Tk root or +-master object either directly or indirectly by instantiating a tkinter or +-idle class. For the benefit of buildbot machines that do not have a graphics +-screen, gui tests must be 'guarded' by "requires('gui')" in a setUp +-function or method. This will typically be setUpClass. ++Gui tests need 'requires' from test.support (test.test_support in 2.7). A ++test is a gui test if it creates a Tk root or master object either directly ++or indirectly by instantiating a tkinter or idle class. For the benefit of ++test processes that either have no graphical environment available or are not ++allowed to use it, gui tests must be 'guarded' by "requires('gui')" in a ++setUp function or method. This will typically be setUpClass. + + To avoid interfering with other gui tests, all gui objects must be destroyed + and deleted by the end of the test. 
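A minimal skeleton of the guarded gui test described in the README text above and below (module, class and test names are hypothetical; the guard and teardown follow the README's own template):

    from test.support import requires
    import unittest
    from tkinter import Tk

    class XyzGuiTest(unittest.TestCase):
        @classmethod
        def setUpClass(cls):
            requires('gui')        # skipped when no gui resource/display is usable
            cls.root = Tk()

        @classmethod
        def tearDownClass(cls):
            cls.root.destroy()     # destroy and delete all gui objects
            del cls.root

        def test_placeholder(self):
            self.assertTrue(True)

    if __name__ == '__main__':
        unittest.main(verbosity=2)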
If a widget, such as a Tk root, is created +@@ -57,11 +56,17 @@ + del cls.root + --- + +-Support.requires('gui') returns true if it is either called in a main module +-(which never happens on buildbots) or if use_resources contains 'gui'. +-Use_resources is set by test.regrtest but not by unittest. So when running +-tests in another module with unittest, we set it ourselves, as in the xyz.py +-template above. ++Support.requires('gui') causes the test(s) it guards to be skipped if any of ++a few conditions are met: ++ - The tests are being run by regrtest.py, and it was started without ++ enabling the "gui" resource with the "-u" command line option. ++ - The tests are being run on Windows by a service that is not allowed to ++ interact with the graphical environment. ++ - The tests are being run on Mac OSX in a process that cannot make a window ++ manager connection. ++ - tkinter.Tk cannot be successfully instantiated for some reason. ++ - test.support.use_resources has been set by something other than ++ regrtest.py and does not contain "gui". + + Since non-gui tests always run, but gui tests only sometimes, tests of non-gui + operations should best avoid needing a gui. Methods that make incidental use of +@@ -88,8 +93,8 @@ + + To run all idle_test/test_*.py tests, either interactively + ('>>>', with unittest imported) or from a command line, use one of the +-following. (Notes: unittest does not run gui tests; in 2.7, 'test ' (with the +-space) is 'test.regrtest '; where present, -v and -ugui can be omitted.) ++following. (Notes: in 2.7, 'test ' (with the space) is 'test.regrtest '; ++where present, -v and -ugui can be omitted.) + + >>> unittest.main('idlelib.idle_test', verbosity=2, exit=False) + python -m unittest -v idlelib.idle_test +@@ -98,13 +103,13 @@ + + The idle tests are 'discovered' by idlelib.idle_test.__init__.load_tests, + which is also imported into test.test_idle. Normally, neither file should be +-changed when working on individual test modules. The third command runs runs ++changed when working on individual test modules. The third command runs + unittest indirectly through regrtest. The same happens when the entire test + suite is run with 'python -m test'. So that command must work for buildbots + to stay green. Idle tests must not disturb the environment in a way that + makes other tests fail (issue 18081). + + To run an individual Testcase or test method, extend the dotted name given to +-unittest on the command line. (But gui tests will not this way.) ++unittest on the command line. + + python -m unittest -v idlelib.idle_test.test_xyz.Test_case.test_meth +diff -r c0e311e010fc Lib/idlelib/idle_test/htest.py +--- a/Lib/idlelib/idle_test/htest.py ++++ b/Lib/idlelib/idle_test/htest.py +@@ -1,9 +1,12 @@ + '''Run human tests of Idle's window, dialog, and popup widgets. + +-run(test): run *test*, a callable that causes a widget to be displayed. +-runall(): run all tests defined in this file. ++run(*tests) ++Run each callable in tests after finding the matching test spec in this file. ++If there are none, run an htest for each spec dict in this file after finding ++the matching callable in the module named in the spec. + +-Let X be a global name bound to a widget callable. End the module with ++In a tested module, let X be a global name bound to a widget callable. ++End the module with + + if __name__ == '__main__': + +@@ -13,10 +16,10 @@ + The X object must have a .__name__ attribute and a 'parent' parameter. 
+ X will often be a widget class, but a callable instance with .__name__ + or a wrapper function also work. The name of wrapper functions, like +-'_Editor_Window', should start with '_'. ++'_editor_window', should start with '_'. + +-This file must contain a matching instance of the folling template, +-with X.__name__ prepended, as in '_Editor_window_spec ...'. ++This file must contain a matching instance of the following template, ++with X.__name__ prepended, as in '_editor_window_spec ...'. + + _spec = { + 'file': '', +@@ -24,70 +27,343 @@ + 'msg': "" + } + +-file (no .py): used in runall() to import the file and get X. ++file (no .py): used in run() to import the file and get X. + kwds: passed to X (**kwds), after 'parent' is added, to initialize X. + title: an example; used for some widgets, delete if not. + msg: displayed in a master window. Hints as to how the user might + test the widget. Close the window to skip or end the test. ++ ++Modules not being tested at the moment: ++PyShell.PyShellEditorWindow ++Debugger.Debugger ++AutoCompleteWindow.AutoCompleteWindow ++OutputWindow.OutputWindow (indirectly being tested with grep test) + ''' + from importlib import import_module ++from idlelib.macosxSupport import _initializeTkVariantTests + import tkinter as tk + +- +-_Editor_window_spec = { +- 'file': 'EditorWindow', +- 'kwds': {}, +- 'msg': "Test editor functions of interest" +- } +- +-_Help_dialog_spec = { +- 'file': 'EditorWindow', +- 'kwds': {}, +- 'msg': "If the help text displays, this works" +- } +- + AboutDialog_spec = { + 'file': 'aboutDialog', +- 'kwds': {'title': 'About test'}, +- 'msg': "Try each button" ++ 'kwds': {'title': 'aboutDialog test', ++ '_htest': True, ++ }, ++ 'msg': "Test every button. Ensure Python, TK and IDLE versions " ++ "are correctly displayed.\n [Close] to exit.", + } + ++_calltip_window_spec = { ++ 'file': 'CallTipWindow', ++ 'kwds': {}, ++ 'msg': "Typing '(' should display a calltip.\n" ++ "Typing ') should hide the calltip.\n" ++ } ++ ++_class_browser_spec = { ++ 'file': 'ClassBrowser', ++ 'kwds': {}, ++ 'msg': "Inspect names of module, class(with superclass if " ++ "applicable), methods and functions.\nToggle nested items.\n" ++ "Double clicking on items prints a traceback for an exception " ++ "that is ignored." ++ } ++ ++_color_delegator_spec = { ++ 'file': 'ColorDelegator', ++ 'kwds': {}, ++ 'msg': "The text is sample Python code.\n" ++ "Ensure components like comments, keywords, builtins,\n" ++ "string, definitions, and break are correctly colored.\n" ++ "The default color scheme is in idlelib/config-highlight.def" ++ } ++ ++ConfigDialog_spec = { ++ 'file': 'configDialog', ++ 'kwds': {'title': 'Settings', ++ '_htest': True,}, ++ 'msg': "IDLE preferences dialog.\n" ++ "In the 'Fonts/Tabs' tab, changing font face, should update the " ++ "font face of the text in the area below it.\nIn the " ++ "'Highlighting' tab, try different color schemes. Clicking " ++ "items in the sample program should update the choices above it." ++ "\nIn the 'Keys' and 'General' tab, test settings of interest." ++ "\n[Ok] to close the dialog.[Apply] to apply the settings and " ++ "and [Cancel] to revert all changes.\nRe-run the test to ensure " ++ "changes made have persisted." ++ } ++ ++_dyn_option_menu_spec = { ++ 'file': 'dynOptionMenuWidget', ++ 'kwds': {}, ++ 'msg': "Select one of the many options in the 'old option set'.\n" ++ "Click the button to change the option set.\n" ++ "Select one of the many options in the 'new option set'." 
++ } ++ ++_editor_window_spec = { ++ 'file': 'EditorWindow', ++ 'kwds': {}, ++ 'msg': "Test editor functions of interest." ++ } + + GetCfgSectionNameDialog_spec = { + 'file': 'configSectionNameDialog', + 'kwds': {'title':'Get Name', +- 'message':'Enter something', +- 'used_names': {'abc'}, +- '_htest': True}, ++ 'message':'Enter something', ++ 'used_names': {'abc'}, ++ '_htest': True}, + 'msg': "After the text entered with [Ok] is stripped, , " +- "'abc', or more that 30 chars are errors.\n" +- "Close 'Get Name' with a valid entry (printed to Shell), [Cancel], or [X]", ++ "'abc', or more that 30 chars are errors.\n" ++ "Close 'Get Name' with a valid entry (printed to Shell), " ++ "[Cancel], or [X]", + } + +-def run(test): +- "Display a widget with callable *test* using a _spec dict" ++GetHelpSourceDialog_spec = { ++ 'file': 'configHelpSourceEdit', ++ 'kwds': {'title': 'Get helpsource', ++ '_htest': True}, ++ 'msg': "Enter menu item name and help file path\n " ++ " and more than 30 chars are invalid menu item names.\n" ++ ", file does not exist are invalid path items.\n" ++ "Test for incomplete web address for help file path.\n" ++ "A valid entry will be printed to shell with [0k].\n" ++ "[Cancel] will print None to shell", ++ } ++ ++# Update once issue21519 is resolved. ++GetKeysDialog_spec = { ++ 'file': 'keybindingDialog', ++ 'kwds': {'title': 'Test keybindings', ++ 'action': 'find-again', ++ 'currentKeySequences': [''] , ++ '_htest': True, ++ }, ++ 'msg': "Test for different key modifier sequences.\n" ++ " is invalid.\n" ++ "No modifier key is invalid.\n" ++ "Shift key with [a-z],[0-9], function key, move key, tab, space" ++ "is invalid.\nNo validitity checking if advanced key binding " ++ "entry is used." ++ } ++ ++_grep_dialog_spec = { ++ 'file': 'GrepDialog', ++ 'kwds': {}, ++ 'msg': "Click the 'Show GrepDialog' button.\n" ++ "Test the various 'Find-in-files' functions.\n" ++ "The results should be displayed in a new '*Output*' window.\n" ++ "'Right-click'->'Goto file/line' anywhere in the search results " ++ "should open that file \nin a new EditorWindow." ++ } ++ ++_help_dialog_spec = { ++ 'file': 'EditorWindow', ++ 'kwds': {}, ++ 'msg': "If the help text displays, this works.\n" ++ "Text is selectable. Window is scrollable." ++ } ++ ++_io_binding_spec = { ++ 'file': 'IOBinding', ++ 'kwds': {}, ++ 'msg': "Test the following bindings\n" ++ " to display open window from file dialog.\n" ++ " to save the file\n" ++ } ++ ++_multi_call_spec = { ++ 'file': 'MultiCall', ++ 'kwds': {}, ++ 'msg': "The following actions should trigger a print to console or IDLE" ++ " Shell.\nEntering and leaving the text area, key entry, " ++ ",\n, , " ++ ", \n, and " ++ "focusing out of the window\nare sequences to be tested." ++ } ++ ++_multistatus_bar_spec = { ++ 'file': 'MultiStatusBar', ++ 'kwds': {}, ++ 'msg': "Ensure presence of multi-status bar below text area.\n" ++ "Click 'Update Status' to change the multi-status text" ++ } ++ ++_object_browser_spec = { ++ 'file': 'ObjectBrowser', ++ 'kwds': {}, ++ 'msg': "Double click on items upto the lowest level.\n" ++ "Attributes of the objects and related information " ++ "will be displayed side-by-side at each level." ++ } ++ ++_path_browser_spec = { ++ 'file': 'PathBrowser', ++ 'kwds': {}, ++ 'msg': "Test for correct display of all paths in sys.path.\n" ++ "Toggle nested items upto the lowest level.\n" ++ "Double clicking on an item prints a traceback\n" ++ "for an exception that is ignored." 
++ } ++ ++_percolator_spec = { ++ 'file': 'Percolator', ++ 'kwds': {}, ++ 'msg': "There are two tracers which can be toggled using a checkbox.\n" ++ "Toggling a tracer 'on' by checking it should print tracer" ++ "output to the console or to the IDLE shell.\n" ++ "If both the tracers are 'on', the output from the tracer which " ++ "was switched 'on' later, should be printed first\n" ++ "Test for actions like text entry, and removal." ++ } ++ ++_replace_dialog_spec = { ++ 'file': 'ReplaceDialog', ++ 'kwds': {}, ++ 'msg': "Click the 'Replace' button.\n" ++ "Test various replace options in the 'Replace dialog'.\n" ++ "Click [Close] or [X] to close the 'Replace Dialog'." ++ } ++ ++_search_dialog_spec = { ++ 'file': 'SearchDialog', ++ 'kwds': {}, ++ 'msg': "Click the 'Search' button.\n" ++ "Test various search options in the 'Search dialog'.\n" ++ "Click [Close] or [X] to close the 'Search Dialog'." ++ } ++ ++_scrolled_list_spec = { ++ 'file': 'ScrolledList', ++ 'kwds': {}, ++ 'msg': "You should see a scrollable list of items\n" ++ "Selecting (clicking) or double clicking an item " ++ "prints the name to the console or Idle shell.\n" ++ "Right clicking an item will display a popup." ++ } ++ ++_stack_viewer_spec = { ++ 'file': 'StackViewer', ++ 'kwds': {}, ++ 'msg': "A stacktrace for a NameError exception.\n" ++ "Expand 'idlelib ...' and ''.\n" ++ "Check that exc_value, exc_tb, and exc_type are correct.\n" ++ } ++ ++_tabbed_pages_spec = { ++ 'file': 'tabbedpages', ++ 'kwds': {}, ++ 'msg': "Toggle between the two tabs 'foo' and 'bar'\n" ++ "Add a tab by entering a suitable name for it.\n" ++ "Remove an existing tab by entering its name.\n" ++ "Remove all existing tabs.\n" ++ " is an invalid add page and remove page name.\n" ++ } ++ ++TextViewer_spec = { ++ 'file': 'textView', ++ 'kwds': {'title': 'Test textView', ++ 'text':'The quick brown fox jumps over the lazy dog.\n'*35, ++ '_htest': True}, ++ 'msg': "Test for read-only property of text.\n" ++ "Text is selectable. Window is scrollable.", ++ } ++ ++_tooltip_spec = { ++ 'file': 'ToolTip', ++ 'kwds': {}, ++ 'msg': "Place mouse cursor over both the buttons\n" ++ "A tooltip should appear with some text." ++ } ++ ++_tree_widget_spec = { ++ 'file': 'TreeWidget', ++ 'kwds': {}, ++ 'msg': "The canvas is scrollable.\n" ++ "Click on folders upto to the lowest level." ++ } ++ ++_undo_delegator_spec = { ++ 'file': 'UndoDelegator', ++ 'kwds': {}, ++ 'msg': "Click [Undo] to undo any action.\n" ++ "Click [Redo] to redo any action.\n" ++ "Click [Dump] to dump the current state " ++ "by printing to the console or the IDLE shell.\n" ++ } ++ ++_widget_redirector_spec = { ++ 'file': 'WidgetRedirector', ++ 'kwds': {}, ++ 'msg': "Every text insert should be printed to the console." ++ "or the IDLE shell." ++ } ++ ++def run(*tests): + root = tk.Tk() +- test_spec = globals()[test.__name__ + '_spec'] +- test_kwds = test_spec['kwds'] +- test_kwds['parent'] = root ++ root.title('IDLE htest') ++ root.resizable(0, 0) ++ _initializeTkVariantTests(root) ++ ++ # a scrollable Label like constant width text widget. 
++ frameLabel = tk.Frame(root, padx=10) ++ frameLabel.pack() ++ text = tk.Text(frameLabel, wrap='word') ++ text.configure(bg=root.cget('bg'), relief='flat', height=4, width=70) ++ scrollbar = tk.Scrollbar(frameLabel, command=text.yview) ++ text.config(yscrollcommand=scrollbar.set) ++ scrollbar.pack(side='right', fill='y', expand=False) ++ text.pack(side='left', fill='both', expand=True) ++ ++ test_list = [] # List of tuples of the form (spec, callable widget) ++ if tests: ++ for test in tests: ++ test_spec = globals()[test.__name__ + '_spec'] ++ test_spec['name'] = test.__name__ ++ test_list.append((test_spec, test)) ++ else: ++ for k, d in globals().items(): ++ if k.endswith('_spec'): ++ test_name = k[:-5] ++ test_spec = d ++ test_spec['name'] = test_name ++ mod = import_module('idlelib.' + test_spec['file']) ++ test = getattr(mod, test_name) ++ test_list.append((test_spec, test)) ++ ++ test_name = tk.StringVar('') ++ callable_object = None ++ test_kwds = None ++ ++ def next(): ++ ++ nonlocal test_name, callable_object, test_kwds ++ if len(test_list) == 1: ++ next_button.pack_forget() ++ test_spec, callable_object = test_list.pop() ++ test_kwds = test_spec['kwds'] ++ test_kwds['parent'] = root ++ test_name.set('Test ' + test_spec['name']) ++ ++ text.configure(state='normal') # enable text editing ++ text.delete('1.0','end') ++ text.insert("1.0",test_spec['msg']) ++ text.configure(state='disabled') # preserve read-only property + + def run_test(): +- widget = test(**test_kwds) ++ widget = callable_object(**test_kwds) + try: + print(widget.result) + except AttributeError: + pass +- tk.Label(root, text=test_spec['msg'], justify='left').pack() +- tk.Button(root, text='Test ' + test.__name__, command=run_test).pack() ++ ++ button = tk.Button(root, textvariable=test_name, command=run_test) ++ button.pack() ++ next_button = tk.Button(root, text="Next", command=next) ++ next_button.pack() ++ ++ next() ++ + root.mainloop() + +-def runall(): +- "Run all tests. Quick and dirty version." +- for k, d in globals().items(): +- if k.endswith('_spec'): +- mod = import_module('idlelib.' + d['file']) +- test = getattr(mod, k[:-5]) +- run(test) +- + if __name__ == '__main__': +- runall() ++ run() +diff -r c0e311e010fc Lib/idlelib/idle_test/mock_idle.py +--- a/Lib/idlelib/idle_test/mock_idle.py ++++ b/Lib/idlelib/idle_test/mock_idle.py +@@ -5,6 +5,33 @@ + + from idlelib.idle_test.mock_tk import Text + ++class Func: ++ '''Mock function captures args and returns result set by test. ++ ++ Attributes: ++ self.called - records call even if no args, kwds passed. ++ self.result - set by init, returned by call. ++ self.args - captures positional arguments. ++ self.kwds - captures keyword arguments. ++ ++ Most common use will probably be to mock methods. ++ Mock_tk.Var and Mbox_func are special variants of this. ++ ''' ++ def __init__(self, result=None): ++ self.called = False ++ self.result = result ++ self.args = None ++ self.kwds = None ++ def __call__(self, *args, **kwds): ++ self.called = True ++ self.args = args ++ self.kwds = kwds ++ if isinstance(self.result, BaseException): ++ raise self.result ++ else: ++ return self.result ++ ++ + class Editor: + '''Minimally imitate EditorWindow.EditorWindow class. + ''' +@@ -17,6 +44,7 @@ + last = self.text.index('end') + return first, last + ++ + class UndoDelegator: + '''Minimally imitate UndoDelegator,UndoDelegator class. 
+ ''' +diff -r c0e311e010fc Lib/idlelib/idle_test/mock_tk.py +--- a/Lib/idlelib/idle_test/mock_tk.py ++++ b/Lib/idlelib/idle_test/mock_tk.py +@@ -1,9 +1,27 @@ + """Classes that replace tkinter gui objects used by an object being tested. + +-A gui object is anything with a master or parent paramenter, which is typically +-required in spite of what the doc strings say. ++A gui object is anything with a master or parent paramenter, which is ++typically required in spite of what the doc strings say. + """ + ++class Event: ++ '''Minimal mock with attributes for testing event handlers. ++ ++ This is not a gui object, but is used as an argument for callbacks ++ that access attributes of the event passed. If a callback ignores ++ the event, other than the fact that is happened, pass 'event'. ++ ++ Keyboard, mouse, window, and other sources generate Event instances. ++ Event instances have the following attributes: serial (number of ++ event), time (of event), type (of event as number), widget (in which ++ event occurred), and x,y (position of mouse). There are other ++ attributes for specific events, such as keycode for key events. ++ tkinter.Event.__doc__ has more but is still not complete. ++ ''' ++ def __init__(self, **kwds): ++ "Create event with attributes needed for test" ++ self.__dict__.update(kwds) ++ + class Var: + "Use for String/Int/BooleanVar: incomplete" + def __init__(self, master=None, value=None, name=None): +@@ -20,9 +38,10 @@ + + Instead of displaying a message box, the mock's call method saves the + arguments as instance attributes, which test functions can then examime. ++ The test can set the result returned to ask function + """ +- def __init__(self): +- self.result = None # The return for all show funcs ++ def __init__(self, result=None): ++ self.result = result # Return None for all show funcs + def __call__(self, title, message, *args, **kwds): + # Save all args for possible examination by tester + self.title = title +diff -r c0e311e010fc Lib/idlelib/idle_test/test_autocomplete.py +--- /dev/null ++++ b/Lib/idlelib/idle_test/test_autocomplete.py +@@ -0,0 +1,143 @@ ++import unittest ++from test.support import requires ++from tkinter import Tk, Text, TclError ++ ++import idlelib.AutoComplete as ac ++import idlelib.AutoCompleteWindow as acw ++import idlelib.macosxSupport as mac ++from idlelib.idle_test.mock_idle import Func ++from idlelib.idle_test.mock_tk import Event ++ ++class AutoCompleteWindow: ++ def complete(): ++ return ++ ++class DummyEditwin: ++ def __init__(self, root, text): ++ self.root = root ++ self.text = text ++ self.indentwidth = 8 ++ self.tabwidth = 8 ++ self.context_use_ps1 = True ++ ++ ++class AutoCompleteTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ requires('gui') ++ cls.root = Tk() ++ mac.setupApp(cls.root, None) ++ cls.text = Text(cls.root) ++ cls.editor = DummyEditwin(cls.root, cls.text) ++ ++ @classmethod ++ def tearDownClass(cls): ++ cls.root.destroy() ++ del cls.text ++ del cls.editor ++ del cls.root ++ ++ def setUp(self): ++ self.editor.text.delete('1.0', 'end') ++ self.autocomplete = ac.AutoComplete(self.editor) ++ ++ def test_init(self): ++ self.assertEqual(self.autocomplete.editwin, self.editor) ++ ++ def test_make_autocomplete_window(self): ++ testwin = self.autocomplete._make_autocomplete_window() ++ self.assertIsInstance(testwin, acw.AutoCompleteWindow) ++ ++ def test_remove_autocomplete_window(self): ++ self.autocomplete.autocompletewindow = ( ++ self.autocomplete._make_autocomplete_window()) ++ 
self.autocomplete._remove_autocomplete_window() ++ self.assertIsNone(self.autocomplete.autocompletewindow) ++ ++ def test_force_open_completions_event(self): ++ # Test that force_open_completions_event calls _open_completions ++ o_cs = Func() ++ self.autocomplete.open_completions = o_cs ++ self.autocomplete.force_open_completions_event('event') ++ self.assertEqual(o_cs.args, (True, False, True)) ++ ++ def test_try_open_completions_event(self): ++ Equal = self.assertEqual ++ autocomplete = self.autocomplete ++ trycompletions = self.autocomplete.try_open_completions_event ++ o_c_l = Func() ++ autocomplete._open_completions_later = o_c_l ++ ++ # _open_completions_later should not be called with no text in editor ++ trycompletions('event') ++ Equal(o_c_l.args, None) ++ ++ # _open_completions_later should be called with COMPLETE_ATTRIBUTES (1) ++ self.text.insert('1.0', 're.') ++ trycompletions('event') ++ Equal(o_c_l.args, (False, False, False, 1)) ++ ++ # _open_completions_later should be called with COMPLETE_FILES (2) ++ self.text.delete('1.0', 'end') ++ self.text.insert('1.0', '"./Lib/') ++ trycompletions('event') ++ Equal(o_c_l.args, (False, False, False, 2)) ++ ++ def test_autocomplete_event(self): ++ Equal = self.assertEqual ++ autocomplete = self.autocomplete ++ ++ # Test that the autocomplete event is ignored if user is pressing a ++ # modifier key in addition to the tab key ++ ev = Event(mc_state=True) ++ self.assertIsNone(autocomplete.autocomplete_event(ev)) ++ del ev.mc_state ++ ++ # If autocomplete window is open, complete() method is called ++ testwin = self.autocomplete._make_autocomplete_window() ++ self.text.insert('1.0', 're.') ++ Equal(self.autocomplete.autocomplete_event(ev), 'break') ++ ++ # If autocomplete window is not active or does not exist, ++ # open_completions is called. Return depends on its return. ++ autocomplete._remove_autocomplete_window() ++ o_cs = Func() # .result = None ++ autocomplete.open_completions = o_cs ++ Equal(self.autocomplete.autocomplete_event(ev), None) ++ Equal(o_cs.args, (False, True, True)) ++ o_cs.result = True ++ Equal(self.autocomplete.autocomplete_event(ev), 'break') ++ Equal(o_cs.args, (False, True, True)) ++ ++ def test_open_completions_later(self): ++ # Test that autocomplete._delayed_completion_id is set ++ pass ++ ++ def test_delayed_open_completions(self): ++ # Test that autocomplete._delayed_completion_id set to None and that ++ # open_completions only called if insertion index is the same as ++ # _delayed_completion_index ++ pass ++ ++ def test_open_completions(self): ++ # Test completions of files and attributes as well as non-completion ++ # of errors ++ pass ++ ++ def test_fetch_completions(self): ++ # Test that fetch_completions returns 2 lists: ++ # For attribute completion, a large list containing all variables, and ++ # a small list containing non-private variables. ++ # For file completion, a large list containing all files in the path, ++ # and a small list containing files that do not start with '.' 
++ pass ++ ++ def test_get_entity(self): ++ # Test that a name is in the namespace of sys.modules and ++ # __main__.__dict__ ++ pass ++ ++ ++if __name__ == '__main__': ++ unittest.main(verbosity=2) +diff -r c0e311e010fc Lib/idlelib/idle_test/test_autoexpand.py +--- /dev/null ++++ b/Lib/idlelib/idle_test/test_autoexpand.py +@@ -0,0 +1,141 @@ ++"""Unit tests for idlelib.AutoExpand""" ++import unittest ++from test.support import requires ++from tkinter import Text, Tk ++#from idlelib.idle_test.mock_tk import Text ++from idlelib.AutoExpand import AutoExpand ++ ++ ++class Dummy_Editwin: ++ # AutoExpand.__init__ only needs .text ++ def __init__(self, text): ++ self.text = text ++ ++class AutoExpandTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ if 'tkinter' in str(Text): ++ requires('gui') ++ cls.tk = Tk() ++ cls.text = Text(cls.tk) ++ else: ++ cls.text = Text() ++ cls.auto_expand = AutoExpand(Dummy_Editwin(cls.text)) ++ ++ @classmethod ++ def tearDownClass(cls): ++ if hasattr(cls, 'tk'): ++ cls.tk.destroy() ++ del cls.tk ++ del cls.text, cls.auto_expand ++ ++ def tearDown(self): ++ self.text.delete('1.0', 'end') ++ ++ def test_get_prevword(self): ++ text = self.text ++ previous = self.auto_expand.getprevword ++ equal = self.assertEqual ++ ++ equal(previous(), '') ++ ++ text.insert('insert', 't') ++ equal(previous(), 't') ++ ++ text.insert('insert', 'his') ++ equal(previous(), 'this') ++ ++ text.insert('insert', ' ') ++ equal(previous(), '') ++ ++ text.insert('insert', 'is') ++ equal(previous(), 'is') ++ ++ text.insert('insert', '\nsample\nstring') ++ equal(previous(), 'string') ++ ++ text.delete('3.0', 'insert') ++ equal(previous(), '') ++ ++ text.delete('1.0', 'end') ++ equal(previous(), '') ++ ++ def test_before_only(self): ++ previous = self.auto_expand.getprevword ++ expand = self.auto_expand.expand_word_event ++ equal = self.assertEqual ++ ++ self.text.insert('insert', 'ab ac bx ad ab a') ++ equal(self.auto_expand.getwords(), ['ab', 'ad', 'ac', 'a']) ++ expand('event') ++ equal(previous(), 'ab') ++ expand('event') ++ equal(previous(), 'ad') ++ expand('event') ++ equal(previous(), 'ac') ++ expand('event') ++ equal(previous(), 'a') ++ ++ def test_after_only(self): ++ # Also add punctuation 'noise' that shoud be ignored. 
++ text = self.text ++ previous = self.auto_expand.getprevword ++ expand = self.auto_expand.expand_word_event ++ equal = self.assertEqual ++ ++ text.insert('insert', 'a, [ab] ac: () bx"" cd ac= ad ya') ++ text.mark_set('insert', '1.1') ++ equal(self.auto_expand.getwords(), ['ab', 'ac', 'ad', 'a']) ++ expand('event') ++ equal(previous(), 'ab') ++ expand('event') ++ equal(previous(), 'ac') ++ expand('event') ++ equal(previous(), 'ad') ++ expand('event') ++ equal(previous(), 'a') ++ ++ def test_both_before_after(self): ++ text = self.text ++ previous = self.auto_expand.getprevword ++ expand = self.auto_expand.expand_word_event ++ equal = self.assertEqual ++ ++ text.insert('insert', 'ab xy yz\n') ++ text.insert('insert', 'a ac by ac') ++ ++ text.mark_set('insert', '2.1') ++ equal(self.auto_expand.getwords(), ['ab', 'ac', 'a']) ++ expand('event') ++ equal(previous(), 'ab') ++ expand('event') ++ equal(previous(), 'ac') ++ expand('event') ++ equal(previous(), 'a') ++ ++ def test_other_expand_cases(self): ++ text = self.text ++ expand = self.auto_expand.expand_word_event ++ equal = self.assertEqual ++ ++ # no expansion candidate found ++ equal(self.auto_expand.getwords(), []) ++ equal(expand('event'), 'break') ++ ++ text.insert('insert', 'bx cy dz a') ++ equal(self.auto_expand.getwords(), []) ++ ++ # reset state by successfully expanding once ++ # move cursor to another position and expand again ++ text.insert('insert', 'ac xy a ac ad a') ++ text.mark_set('insert', '1.7') ++ expand('event') ++ initial_state = self.auto_expand.state ++ text.mark_set('insert', '1.end') ++ expand('event') ++ new_state = self.auto_expand.state ++ self.assertNotEqual(initial_state, new_state) ++ ++if __name__ == '__main__': ++ unittest.main(verbosity=2) +diff -r c0e311e010fc Lib/idlelib/idle_test/test_configdialog.py +--- /dev/null ++++ b/Lib/idlelib/idle_test/test_configdialog.py +@@ -0,0 +1,32 @@ ++'''Unittests for idlelib/configHandler.py ++ ++Coverage: 46% just by creating dialog. The other half is change code. 
++ ++''' ++import unittest ++from test.support import requires ++from tkinter import Tk ++from idlelib.configDialog import ConfigDialog ++from idlelib.macosxSupport import _initializeTkVariantTests ++ ++ ++class ConfigDialogTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ requires('gui') ++ cls.root = Tk() ++ _initializeTkVariantTests(cls.root) ++ ++ @classmethod ++ def tearDownClass(cls): ++ cls.root.destroy() ++ del cls.root ++ ++ def test_dialog(self): ++ d=ConfigDialog(self.root, 'Test', _utest=True) ++ d.destroy() ++ ++ ++if __name__ == '__main__': ++ unittest.main(verbosity=2) +diff -r c0e311e010fc Lib/idlelib/idle_test/test_hyperparser.py +--- /dev/null ++++ b/Lib/idlelib/idle_test/test_hyperparser.py +@@ -0,0 +1,273 @@ ++"""Unittest for idlelib.HyperParser""" ++import unittest ++from test.support import requires ++from tkinter import Tk, Text ++from idlelib.EditorWindow import EditorWindow ++from idlelib.HyperParser import HyperParser ++ ++class DummyEditwin: ++ def __init__(self, text): ++ self.text = text ++ self.indentwidth = 8 ++ self.tabwidth = 8 ++ self.context_use_ps1 = True ++ self.num_context_lines = 50, 500, 1000 ++ ++ _build_char_in_string_func = EditorWindow._build_char_in_string_func ++ is_char_in_string = EditorWindow.is_char_in_string ++ ++ ++class HyperParserTest(unittest.TestCase): ++ code = ( ++ '"""This is a module docstring"""\n' ++ '# this line is a comment\n' ++ 'x = "this is a string"\n' ++ "y = 'this is also a string'\n" ++ 'l = [i for i in range(10)]\n' ++ 'm = [py*py for # comment\n' ++ ' py in l]\n' ++ 'x.__len__\n' ++ "z = ((r'asdf')+('a')))\n" ++ '[x for x in\n' ++ 'for = False\n' ++ 'cliché = "this is a string with unicode, what a cliché"' ++ ) ++ ++ @classmethod ++ def setUpClass(cls): ++ requires('gui') ++ cls.root = Tk() ++ cls.text = Text(cls.root) ++ cls.editwin = DummyEditwin(cls.text) ++ ++ @classmethod ++ def tearDownClass(cls): ++ del cls.text, cls.editwin ++ cls.root.destroy() ++ del cls.root ++ ++ def setUp(self): ++ self.text.insert('insert', self.code) ++ ++ def tearDown(self): ++ self.text.delete('1.0', 'end') ++ self.editwin.context_use_ps1 = True ++ ++ def get_parser(self, index): ++ """ ++ Return a parser object with index at 'index' ++ """ ++ return HyperParser(self.editwin, index) ++ ++ def test_init(self): ++ """ ++ test corner cases in the init method ++ """ ++ with self.assertRaises(ValueError) as ve: ++ self.text.tag_add('console', '1.0', '1.end') ++ p = self.get_parser('1.5') ++ self.assertIn('precedes', str(ve.exception)) ++ ++ # test without ps1 ++ self.editwin.context_use_ps1 = False ++ ++ # number of lines lesser than 50 ++ p = self.get_parser('end') ++ self.assertEqual(p.rawtext, self.text.get('1.0', 'end')) ++ ++ # number of lines greater than 50 ++ self.text.insert('end', self.text.get('1.0', 'end')*4) ++ p = self.get_parser('54.5') ++ ++ def test_is_in_string(self): ++ get = self.get_parser ++ ++ p = get('1.0') ++ self.assertFalse(p.is_in_string()) ++ p = get('1.4') ++ self.assertTrue(p.is_in_string()) ++ p = get('2.3') ++ self.assertFalse(p.is_in_string()) ++ p = get('3.3') ++ self.assertFalse(p.is_in_string()) ++ p = get('3.7') ++ self.assertTrue(p.is_in_string()) ++ p = get('4.6') ++ self.assertTrue(p.is_in_string()) ++ p = get('12.54') ++ self.assertTrue(p.is_in_string()) ++ ++ def test_is_in_code(self): ++ get = self.get_parser ++ ++ p = get('1.0') ++ self.assertTrue(p.is_in_code()) ++ p = get('1.1') ++ self.assertFalse(p.is_in_code()) ++ p = get('2.5') ++ self.assertFalse(p.is_in_code()) ++ p 
= get('3.4') ++ self.assertTrue(p.is_in_code()) ++ p = get('3.6') ++ self.assertFalse(p.is_in_code()) ++ p = get('4.14') ++ self.assertFalse(p.is_in_code()) ++ ++ def test_get_surrounding_bracket(self): ++ get = self.get_parser ++ ++ def without_mustclose(parser): ++ # a utility function to get surrounding bracket ++ # with mustclose=False ++ return parser.get_surrounding_brackets(mustclose=False) ++ ++ def with_mustclose(parser): ++ # a utility function to get surrounding bracket ++ # with mustclose=True ++ return parser.get_surrounding_brackets(mustclose=True) ++ ++ p = get('3.2') ++ self.assertIsNone(with_mustclose(p)) ++ self.assertIsNone(without_mustclose(p)) ++ ++ p = get('5.6') ++ self.assertTupleEqual(without_mustclose(p), ('5.4', '5.25')) ++ self.assertTupleEqual(without_mustclose(p), with_mustclose(p)) ++ ++ p = get('5.23') ++ self.assertTupleEqual(without_mustclose(p), ('5.21', '5.24')) ++ self.assertTupleEqual(without_mustclose(p), with_mustclose(p)) ++ ++ p = get('6.15') ++ self.assertTupleEqual(without_mustclose(p), ('6.4', '6.end')) ++ self.assertIsNone(with_mustclose(p)) ++ ++ p = get('9.end') ++ self.assertIsNone(with_mustclose(p)) ++ self.assertIsNone(without_mustclose(p)) ++ ++ def test_get_expression(self): ++ get = self.get_parser ++ ++ p = get('4.2') ++ self.assertEqual(p.get_expression(), 'y ') ++ ++ p = get('4.7') ++ with self.assertRaises(ValueError) as ve: ++ p.get_expression() ++ self.assertIn('is inside a code', str(ve.exception)) ++ ++ p = get('5.25') ++ self.assertEqual(p.get_expression(), 'range(10)') ++ ++ p = get('6.7') ++ self.assertEqual(p.get_expression(), 'py') ++ ++ p = get('6.8') ++ self.assertEqual(p.get_expression(), '') ++ ++ p = get('7.9') ++ self.assertEqual(p.get_expression(), 'py') ++ ++ p = get('8.end') ++ self.assertEqual(p.get_expression(), 'x.__len__') ++ ++ p = get('9.13') ++ self.assertEqual(p.get_expression(), "r'asdf'") ++ ++ p = get('9.17') ++ with self.assertRaises(ValueError) as ve: ++ p.get_expression() ++ self.assertIn('is inside a code', str(ve.exception)) ++ ++ p = get('10.0') ++ self.assertEqual(p.get_expression(), '') ++ ++ p = get('10.6') ++ self.assertEqual(p.get_expression(), '') ++ ++ p = get('10.11') ++ self.assertEqual(p.get_expression(), '') ++ ++ p = get('11.3') ++ self.assertEqual(p.get_expression(), '') ++ ++ p = get('11.11') ++ self.assertEqual(p.get_expression(), 'False') ++ ++ p = get('12.6') ++ self.assertEqual(p.get_expression(), 'cliché') ++ ++ def test_eat_identifier(self): ++ def is_valid_id(candidate): ++ result = HyperParser._eat_identifier(candidate, 0, len(candidate)) ++ if result == len(candidate): ++ return True ++ elif result == 0: ++ return False ++ else: ++ err_msg = "Unexpected result: {} (expected 0 or {}".format( ++ result, len(candidate) ++ ) ++ raise Exception(err_msg) ++ ++ # invalid first character which is valid elsewhere in an identifier ++ self.assertFalse(is_valid_id('2notid')) ++ ++ # ASCII-only valid identifiers ++ self.assertTrue(is_valid_id('valid_id')) ++ self.assertTrue(is_valid_id('_valid_id')) ++ self.assertTrue(is_valid_id('valid_id_')) ++ self.assertTrue(is_valid_id('_2valid_id')) ++ ++ # keywords which should be "eaten" ++ self.assertTrue(is_valid_id('True')) ++ self.assertTrue(is_valid_id('False')) ++ self.assertTrue(is_valid_id('None')) ++ ++ # keywords which should not be "eaten" ++ self.assertFalse(is_valid_id('for')) ++ self.assertFalse(is_valid_id('import')) ++ self.assertFalse(is_valid_id('return')) ++ ++ # valid unicode identifiers ++ 
self.assertTrue(is_valid_id('cliche')) ++ self.assertTrue(is_valid_id('cliché')) ++ self.assertTrue(is_valid_id('a٢')) ++ ++ # invalid unicode identifiers ++ self.assertFalse(is_valid_id('2a')) ++ self.assertFalse(is_valid_id('٢a')) ++ self.assertFalse(is_valid_id('a²')) ++ ++ # valid identifier after "punctuation" ++ self.assertEqual(HyperParser._eat_identifier('+ var', 0, 5), len('var')) ++ self.assertEqual(HyperParser._eat_identifier('+var', 0, 4), len('var')) ++ self.assertEqual(HyperParser._eat_identifier('.var', 0, 4), len('var')) ++ ++ # invalid identifiers ++ self.assertFalse(is_valid_id('+')) ++ self.assertFalse(is_valid_id(' ')) ++ self.assertFalse(is_valid_id(':')) ++ self.assertFalse(is_valid_id('?')) ++ self.assertFalse(is_valid_id('^')) ++ self.assertFalse(is_valid_id('\\')) ++ self.assertFalse(is_valid_id('"')) ++ self.assertFalse(is_valid_id('"a string"')) ++ ++ def test_eat_identifier_various_lengths(self): ++ eat_id = HyperParser._eat_identifier ++ ++ for length in range(1, 21): ++ self.assertEqual(eat_id('a' * length, 0, length), length) ++ self.assertEqual(eat_id('é' * length, 0, length), length) ++ self.assertEqual(eat_id('a' + '2' * (length - 1), 0, length), length) ++ self.assertEqual(eat_id('é' + '2' * (length - 1), 0, length), length) ++ self.assertEqual(eat_id('é' + 'a' * (length - 1), 0, length), length) ++ self.assertEqual(eat_id('é' * (length - 1) + 'a', 0, length), length) ++ self.assertEqual(eat_id('+' * length, 0, length), 0) ++ self.assertEqual(eat_id('2' + 'a' * (length - 1), 0, length), 0) ++ self.assertEqual(eat_id('2' + 'é' * (length - 1), 0, length), 0) ++ ++if __name__ == '__main__': ++ unittest.main(verbosity=2) +diff -r c0e311e010fc Lib/idlelib/idle_test/test_parenmatch.py +--- /dev/null ++++ b/Lib/idlelib/idle_test/test_parenmatch.py +@@ -0,0 +1,109 @@ ++"""Test idlelib.ParenMatch.""" ++# This must currently be a gui test because ParenMatch methods use ++# several text methods not defined on idlelib.idle_test.mock_tk.Text. ++ ++import unittest ++from unittest.mock import Mock ++from test.support import requires ++from tkinter import Tk, Text ++from idlelib.ParenMatch import ParenMatch ++ ++class DummyEditwin: ++ def __init__(self, text): ++ self.text = text ++ self.indentwidth = 8 ++ self.tabwidth = 8 ++ self.context_use_ps1 = True ++ ++ ++class ParenMatchTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ requires('gui') ++ cls.root = Tk() ++ cls.text = Text(cls.root) ++ cls.editwin = DummyEditwin(cls.text) ++ cls.editwin.text_frame = Mock() ++ ++ @classmethod ++ def tearDownClass(cls): ++ del cls.text, cls.editwin ++ cls.root.destroy() ++ del cls.root ++ ++ def tearDown(self): ++ self.text.delete('1.0', 'end') ++ ++ def test_paren_expression(self): ++ """ ++ Test ParenMatch with 'expression' style. ++ """ ++ text = self.text ++ pm = ParenMatch(self.editwin) ++ pm.set_style('expression') ++ ++ text.insert('insert', 'def foobar(a, b') ++ pm.flash_paren_event('event') ++ self.assertIn('<>', text.event_info()) ++ self.assertTupleEqual(text.tag_prevrange('paren', 'end'), ++ ('1.10', '1.15')) ++ text.insert('insert', ')') ++ pm.restore_event() ++ self.assertNotIn('<>', text.event_info()) ++ self.assertEqual(text.tag_prevrange('paren', 'end'), ()) ++ ++ # paren_closed_event can only be tested as below ++ pm.paren_closed_event('event') ++ self.assertTupleEqual(text.tag_prevrange('paren', 'end'), ++ ('1.10', '1.16')) ++ ++ def test_paren_default(self): ++ """ ++ Test ParenMatch with 'default' style. 
++ """ ++ text = self.text ++ pm = ParenMatch(self.editwin) ++ pm.set_style('default') ++ ++ text.insert('insert', 'def foobar(a, b') ++ pm.flash_paren_event('event') ++ self.assertIn('<>', text.event_info()) ++ self.assertTupleEqual(text.tag_prevrange('paren', 'end'), ++ ('1.10', '1.11')) ++ text.insert('insert', ')') ++ pm.restore_event() ++ self.assertNotIn('<>', text.event_info()) ++ self.assertEqual(text.tag_prevrange('paren', 'end'), ()) ++ ++ def test_paren_corner(self): ++ """ ++ Test corner cases in flash_paren_event and paren_closed_event. ++ ++ These cases force conditional expression and alternate paths. ++ """ ++ text = self.text ++ pm = ParenMatch(self.editwin) ++ ++ text.insert('insert', '# this is a commen)') ++ self.assertIsNone(pm.paren_closed_event('event')) ++ ++ text.insert('insert', '\ndef') ++ self.assertIsNone(pm.flash_paren_event('event')) ++ self.assertIsNone(pm.paren_closed_event('event')) ++ ++ text.insert('insert', ' a, *arg)') ++ self.assertIsNone(pm.paren_closed_event('event')) ++ ++ def test_handle_restore_timer(self): ++ pm = ParenMatch(self.editwin) ++ pm.restore_event = Mock() ++ pm.handle_restore_timer(0) ++ self.assertTrue(pm.restore_event.called) ++ pm.restore_event.reset_mock() ++ pm.handle_restore_timer(1) ++ self.assertFalse(pm.restore_event.called) ++ ++ ++if __name__ == '__main__': ++ unittest.main(verbosity=2) +diff -r c0e311e010fc Lib/idlelib/idle_test/test_searchdialogbase.py +--- /dev/null ++++ b/Lib/idlelib/idle_test/test_searchdialogbase.py +@@ -0,0 +1,165 @@ ++'''Unittests for idlelib/SearchDialogBase.py ++ ++Coverage: 99%. The only thing not covered is inconsequential -- ++testing skipping of suite when self.needwrapbutton is false. ++ ++''' ++import unittest ++from test.support import requires ++from tkinter import Tk, Toplevel, Frame, Label, BooleanVar, StringVar ++from idlelib import SearchEngine as se ++from idlelib import SearchDialogBase as sdb ++from idlelib.idle_test.mock_idle import Func ++from idlelib.idle_test.mock_tk import Var, Mbox ++ ++# The following could help make some tests gui-free. ++# However, they currently make radiobutton tests fail. ++##def setUpModule(): ++## # Replace tk objects used to initialize se.SearchEngine. ++## se.BooleanVar = Var ++## se.StringVar = Var ++## ++##def tearDownModule(): ++## se.BooleanVar = BooleanVar ++## se.StringVar = StringVar ++ ++class SearchDialogBaseTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ requires('gui') ++ cls.root = Tk() ++ ++ @classmethod ++ def tearDownClass(cls): ++ cls.root.destroy() ++ del cls.root ++ ++ def setUp(self): ++ self.engine = se.SearchEngine(self.root) # None also seems to work ++ self.dialog = sdb.SearchDialogBase(root=self.root, engine=self.engine) ++ ++ def tearDown(self): ++ self.dialog.close() ++ ++ def test_open_and_close(self): ++ # open calls create_widgets, which needs default_command ++ self.dialog.default_command = None ++ ++ # Since text parameter of .open is not used in base class, ++ # pass dummy 'text' instead of tk.Text(). 
++ self.dialog.open('text') ++ self.assertEqual(self.dialog.top.state(), 'normal') ++ self.dialog.close() ++ self.assertEqual(self.dialog.top.state(), 'withdrawn') ++ ++ self.dialog.open('text', searchphrase="hello") ++ self.assertEqual(self.dialog.ent.get(), 'hello') ++ self.dialog.close() ++ ++ def test_create_widgets(self): ++ self.dialog.create_entries = Func() ++ self.dialog.create_option_buttons = Func() ++ self.dialog.create_other_buttons = Func() ++ self.dialog.create_command_buttons = Func() ++ ++ self.dialog.default_command = None ++ self.dialog.create_widgets() ++ ++ self.assertTrue(self.dialog.create_entries.called) ++ self.assertTrue(self.dialog.create_option_buttons.called) ++ self.assertTrue(self.dialog.create_other_buttons.called) ++ self.assertTrue(self.dialog.create_command_buttons.called) ++ ++ def test_make_entry(self): ++ equal = self.assertEqual ++ self.dialog.row = 0 ++ self.dialog.top = Toplevel(self.root) ++ entry, label = self.dialog.make_entry("Test:", 'hello') ++ equal(label['text'], 'Test:') ++ ++ self.assertIn(entry.get(), 'hello') ++ egi = entry.grid_info() ++ equal(int(egi['row']), 0) ++ equal(int(egi['column']), 1) ++ equal(int(egi['rowspan']), 1) ++ equal(int(egi['columnspan']), 1) ++ equal(self.dialog.row, 1) ++ ++ def test_create_entries(self): ++ self.dialog.row = 0 ++ self.engine.setpat('hello') ++ self.dialog.create_entries() ++ self.assertIn(self.dialog.ent.get(), 'hello') ++ ++ def test_make_frame(self): ++ self.dialog.row = 0 ++ self.dialog.top = Toplevel(self.root) ++ frame, label = self.dialog.make_frame() ++ self.assertEqual(label, '') ++ self.assertIsInstance(frame, Frame) ++ ++ frame, label = self.dialog.make_frame('testlabel') ++ self.assertEqual(label['text'], 'testlabel') ++ self.assertIsInstance(frame, Frame) ++ ++ def btn_test_setup(self, meth): ++ self.dialog.top = Toplevel(self.root) ++ self.dialog.row = 0 ++ return meth() ++ ++ def test_create_option_buttons(self): ++ e = self.engine ++ for state in (0, 1): ++ for var in (e.revar, e.casevar, e.wordvar, e.wrapvar): ++ var.set(state) ++ frame, options = self.btn_test_setup( ++ self.dialog.create_option_buttons) ++ for spec, button in zip (options, frame.pack_slaves()): ++ var, label = spec ++ self.assertEqual(button['text'], label) ++ self.assertEqual(var.get(), state) ++ if state == 1: ++ button.deselect() ++ else: ++ button.select() ++ self.assertEqual(var.get(), 1 - state) ++ ++ def test_create_other_buttons(self): ++ for state in (False, True): ++ var = self.engine.backvar ++ var.set(state) ++ frame, others = self.btn_test_setup( ++ self.dialog.create_other_buttons) ++ buttons = frame.pack_slaves() ++ for spec, button in zip(others, buttons): ++ val, label = spec ++ self.assertEqual(button['text'], label) ++ if val == state: ++ # hit other button, then this one ++ # indexes depend on button order ++ self.assertEqual(var.get(), state) ++ buttons[val].select() ++ self.assertEqual(var.get(), 1 - state) ++ buttons[1-val].select() ++ self.assertEqual(var.get(), state) ++ ++ def test_make_button(self): ++ self.dialog.top = Toplevel(self.root) ++ self.dialog.buttonframe = Frame(self.dialog.top) ++ btn = self.dialog.make_button('Test', self.dialog.close) ++ self.assertEqual(btn['text'], 'Test') ++ ++ def test_create_command_buttons(self): ++ self.dialog.create_command_buttons() ++ # Look for close button command in buttonframe ++ closebuttoncommand = '' ++ for child in self.dialog.buttonframe.winfo_children(): ++ if child['text'] == 'close': ++ closebuttoncommand = child['command'] ++ 
self.assertIn('close', closebuttoncommand) ++ ++ ++ ++if __name__ == '__main__': ++ unittest.main(verbosity=2, exit=2) +diff -r c0e311e010fc Lib/idlelib/idle_test/test_textview.py +--- /dev/null ++++ b/Lib/idlelib/idle_test/test_textview.py +@@ -0,0 +1,97 @@ ++'''Test the functions and main class method of textView.py. ++ ++Since all methods and functions create (or destroy) a TextViewer, which ++is a widget containing multiple widgets, all tests must be gui tests. ++Using mock Text would not change this. Other mocks are used to retrieve ++information about calls. ++ ++The coverage is essentially 100%. ++''' ++from test.support import requires ++requires('gui') ++ ++import unittest ++import os ++from tkinter import Tk ++from idlelib import textView as tv ++from idlelib.idle_test.mock_idle import Func ++from idlelib.idle_test.mock_tk import Mbox ++ ++def setUpModule(): ++ global root ++ root = Tk() ++ ++def tearDownModule(): ++ global root ++ root.destroy() # pyflakes falsely sees root as undefined ++ del root ++ ++ ++class TV(tv.TextViewer): # used by TextViewTest ++ transient = Func() ++ grab_set = Func() ++ wait_window = Func() ++ ++class TextViewTest(unittest.TestCase): ++ ++ def setUp(self): ++ TV.transient.__init__() ++ TV.grab_set.__init__() ++ TV.wait_window.__init__() ++ ++ def test_init_modal(self): ++ view = TV(root, 'Title', 'test text') ++ self.assertTrue(TV.transient.called) ++ self.assertTrue(TV.grab_set.called) ++ self.assertTrue(TV.wait_window.called) ++ view.Ok() ++ ++ def test_init_nonmodal(self): ++ view = TV(root, 'Title', 'test text', modal=False) ++ self.assertFalse(TV.transient.called) ++ self.assertFalse(TV.grab_set.called) ++ self.assertFalse(TV.wait_window.called) ++ view.Ok() ++ ++ def test_ok(self): ++ view = TV(root, 'Title', 'test text', modal=False) ++ view.destroy = Func() ++ view.Ok() ++ self.assertTrue(view.destroy.called) ++ del view.destroy # unmask real function ++ view.destroy ++ ++ ++class textviewTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ cls.orig_mbox = tv.tkMessageBox ++ tv.tkMessageBox = Mbox ++ ++ @classmethod ++ def tearDownClass(cls): ++ tv.tkMessageBox = cls.orig_mbox ++ del cls.orig_mbox ++ ++ def test_view_text(self): ++ # If modal True, tkinter will error with 'can't invoke "event" command' ++ view = tv.view_text(root, 'Title', 'test text', modal=False) ++ self.assertIsInstance(view, tv.TextViewer) ++ ++ def test_view_file(self): ++ test_dir = os.path.dirname(__file__) ++ testfile = os.path.join(test_dir, 'test_textview.py') ++ view = tv.view_file(root, 'Title', testfile, modal=False) ++ self.assertIsInstance(view, tv.TextViewer) ++ self.assertIn('Test', view.textView.get('1.0', '1.end')) ++ view.Ok() ++ ++ # Mock messagebox will be used and view_file will not return anything ++ testfile = os.path.join(test_dir, '../notthere.py') ++ view = tv.view_file(root, 'Title', testfile, modal=False) ++ self.assertIsNone(view) ++ ++ ++if __name__ == '__main__': ++ unittest.main(verbosity=2) +diff -r c0e311e010fc Lib/idlelib/idle_test/test_widgetredir.py +--- /dev/null ++++ b/Lib/idlelib/idle_test/test_widgetredir.py +@@ -0,0 +1,122 @@ ++"""Unittest for idlelib.WidgetRedirector ++ ++100% coverage ++""" ++from test.support import requires ++import unittest ++from idlelib.idle_test.mock_idle import Func ++from tkinter import Tk, Text, TclError ++from idlelib.WidgetRedirector import WidgetRedirector ++ ++ ++class InitCloseTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ requires('gui') ++ cls.tk = Tk() ++ 
cls.text = Text(cls.tk) ++ ++ @classmethod ++ def tearDownClass(cls): ++ cls.text.destroy() ++ cls.tk.destroy() ++ del cls.text, cls.tk ++ ++ def test_init(self): ++ redir = WidgetRedirector(self.text) ++ self.assertEqual(redir.widget, self.text) ++ self.assertEqual(redir.tk, self.text.tk) ++ self.assertRaises(TclError, WidgetRedirector, self.text) ++ redir.close() # restore self.tk, self.text ++ ++ def test_close(self): ++ redir = WidgetRedirector(self.text) ++ redir.register('insert', Func) ++ redir.close() ++ self.assertEqual(redir._operations, {}) ++ self.assertFalse(hasattr(self.text, 'widget')) ++ ++ ++class WidgetRedirectorTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ requires('gui') ++ cls.tk = Tk() ++ cls.text = Text(cls.tk) ++ ++ @classmethod ++ def tearDownClass(cls): ++ cls.text.destroy() ++ cls.tk.destroy() ++ del cls.text, cls.tk ++ ++ def setUp(self): ++ self.redir = WidgetRedirector(self.text) ++ self.func = Func() ++ self.orig_insert = self.redir.register('insert', self.func) ++ self.text.insert('insert', 'asdf') # leaves self.text empty ++ ++ def tearDown(self): ++ self.text.delete('1.0', 'end') ++ self.redir.close() ++ ++ def test_repr(self): # partly for 100% coverage ++ self.assertIn('Redirector', repr(self.redir)) ++ self.assertIn('Original', repr(self.orig_insert)) ++ ++ def test_register(self): ++ self.assertEqual(self.text.get('1.0', 'end'), '\n') ++ self.assertEqual(self.func.args, ('insert', 'asdf')) ++ self.assertIn('insert', self.redir._operations) ++ self.assertIn('insert', self.text.__dict__) ++ self.assertEqual(self.text.insert, self.func) ++ ++ def test_original_command(self): ++ self.assertEqual(self.orig_insert.operation, 'insert') ++ self.assertEqual(self.orig_insert.tk_call, self.text.tk.call) ++ self.orig_insert('insert', 'asdf') ++ self.assertEqual(self.text.get('1.0', 'end'), 'asdf\n') ++ ++ def test_unregister(self): ++ self.assertIsNone(self.redir.unregister('invalid operation name')) ++ self.assertEqual(self.redir.unregister('insert'), self.func) ++ self.assertNotIn('insert', self.redir._operations) ++ self.assertNotIn('insert', self.text.__dict__) ++ ++ def test_unregister_no_attribute(self): ++ del self.text.insert ++ self.assertEqual(self.redir.unregister('insert'), self.func) ++ ++ def test_dispatch_intercept(self): ++ self.func.__init__(True) ++ self.assertTrue(self.redir.dispatch('insert', False)) ++ self.assertFalse(self.func.args[0]) ++ ++ def test_dispatch_bypass(self): ++ self.orig_insert('insert', 'asdf') ++ # tk.call returns '' where Python would return None ++ self.assertEqual(self.redir.dispatch('delete', '1.0', 'end'), '') ++ self.assertEqual(self.text.get('1.0', 'end'), '\n') ++ ++ def test_dispatch_error(self): ++ self.func.__init__(TclError()) ++ self.assertEqual(self.redir.dispatch('insert', False), '') ++ self.assertEqual(self.redir.dispatch('invalid'), '') ++ ++ def test_command_dispatch(self): ++ # Test that .__init__ causes redirection of tk calls ++ # through redir.dispatch ++ self.tk.call(self.text._w, 'insert', 'hello') ++ self.assertEqual(self.func.args, ('hello',)) ++ self.assertEqual(self.text.get('1.0', 'end'), '\n') ++ # Ensure that called through redir .dispatch and not through ++ # self.text.insert by having mock raise TclError. 
++ self.func.__init__(TclError()) ++ self.assertEqual(self.tk.call(self.text._w, 'insert', 'boo'), '') ++ ++ ++ ++if __name__ == '__main__': ++ unittest.main(verbosity=2) +diff -r c0e311e010fc Lib/idlelib/keybindingDialog.py +--- a/Lib/idlelib/keybindingDialog.py ++++ b/Lib/idlelib/keybindingDialog.py +@@ -7,12 +7,13 @@ + import sys + + class GetKeysDialog(Toplevel): +- def __init__(self,parent,title,action,currentKeySequences): ++ def __init__(self,parent,title,action,currentKeySequences,_htest=False): + """ + action - string, the name of the virtual event these keys will be + mapped to + currentKeys - list, a list of all key sequence lists currently mapped + to virtual events, for overlap checking ++ _htest - bool, change box location when running htest + """ + Toplevel.__init__(self, parent) + self.configure(borderwidth=5) +@@ -38,11 +39,14 @@ + self.LoadFinalKeyList() + self.withdraw() #hide while setting geometry + self.update_idletasks() +- self.geometry("+%d+%d" % +- ((parent.winfo_rootx()+((parent.winfo_width()/2) +- -(self.winfo_reqwidth()/2)), +- parent.winfo_rooty()+((parent.winfo_height()/2) +- -(self.winfo_reqheight()/2)) )) ) #centre dialog over parent ++ self.geometry( ++ "+%d+%d" % ( ++ parent.winfo_rootx() + ++ (parent.winfo_width()/2 - self.winfo_reqwidth()/2), ++ parent.winfo_rooty() + ++ ((parent.winfo_height()/2 - self.winfo_reqheight()/2) ++ if not _htest else 150) ++ ) ) #centre dialog over parent (or below htest box) + self.deiconify() #geometry set, unhide + self.wait_window() + +@@ -258,11 +262,5 @@ + return keysOK + + if __name__ == '__main__': +- #test the dialog +- root=Tk() +- def run(): +- keySeq='' +- dlg=GetKeysDialog(root,'Get Keys','find-again',[]) +- print(dlg.result) +- Button(root,text='Dialog',command=run).pack() +- root.mainloop() ++ from idlelib.idle_test.htest import run ++ run(GetKeysDialog) +diff -r c0e311e010fc Lib/idlelib/tabbedpages.py +--- a/Lib/idlelib/tabbedpages.py ++++ b/Lib/idlelib/tabbedpages.py +@@ -467,9 +467,12 @@ + + self._tab_set.set_selected_tab(page_name) + +-if __name__ == '__main__': ++def _tabbed_pages(parent): + # test dialog + root=Tk() ++ width, height, x, y = list(map(int, re.split('[x+]', parent.geometry()))) ++ root.geometry("+%d+%d"%(x, y + 175)) ++ root.title("Test tabbed pages") + tabPage=TabbedPageSet(root, page_names=['Foobar','Baz'], n_rows=0, + expand_tabs=False, + ) +@@ -488,3 +491,8 @@ + labelPgName.pack(padx=5) + entryPgName.pack(padx=5) + root.mainloop() ++ ++ ++if __name__ == '__main__': ++ from idlelib.idle_test.htest import run ++ run(_tabbed_pages) +diff -r c0e311e010fc Lib/idlelib/textView.py +--- a/Lib/idlelib/textView.py ++++ b/Lib/idlelib/textView.py +@@ -9,15 +9,21 @@ + """A simple text viewer dialog for IDLE + + """ +- def __init__(self, parent, title, text, modal=True): ++ def __init__(self, parent, title, text, modal=True, _htest=False): + """Show the given text in a scrollable window with a 'close' button + ++ If modal option set to False, user can interact with other windows, ++ otherwise they will be unable to interact with other windows until ++ the textview window is closed. ++ ++ _htest - bool; change box location when running htest. 
+ """ + Toplevel.__init__(self, parent) + self.configure(borderwidth=5) ++ # place dialog below parent if running htest + self.geometry("=%dx%d+%d+%d" % (625, 500, +- parent.winfo_rootx() + 10, +- parent.winfo_rooty() + 10)) ++ parent.winfo_rootx() + 10, ++ parent.winfo_rooty() + (10 if not _htest else 100))) + #elguavas - config placeholders til config stuff completed + self.bg = '#ffffff' + self.fg = '#000000' +@@ -66,32 +72,15 @@ + try: + with open(filename, 'r', encoding=encoding) as file: + contents = file.read() +- except OSError: +- import tkinter.messagebox as tkMessageBox ++ except IOError: + tkMessageBox.showerror(title='File Load Error', + message='Unable to load file %r .' % filename, + parent=parent) + else: + return view_text(parent, title, contents, modal) + +- + if __name__ == '__main__': +- #test the dialog +- root=Tk() +- root.title('textView test') +- filename = './textView.py' +- with open(filename, 'r') as f: +- text = f.read() +- btn1 = Button(root, text='view_text', +- command=lambda:view_text(root, 'view_text', text)) +- btn1.pack(side=LEFT) +- btn2 = Button(root, text='view_file', +- command=lambda:view_file(root, 'view_file', filename)) +- btn2.pack(side=LEFT) +- btn3 = Button(root, text='nonmodal view_text', +- command=lambda:view_text(root, 'nonmodal view_text', text, +- modal=False)) +- btn3.pack(side=LEFT) +- close = Button(root, text='Close', command=root.destroy) +- close.pack(side=RIGHT) +- root.mainloop() ++ import unittest ++ unittest.main('idlelib.idle_test.test_textview', verbosity=2, exit=False) ++ from idlelib.idle_test.htest import run ++ run(TextViewer) +diff -r c0e311e010fc Lib/inspect.py +--- a/Lib/inspect.py ++++ b/Lib/inspect.py +@@ -1912,6 +1912,10 @@ + pass + else: + if sig is not None: ++ if not isinstance(sig, Signature): ++ raise TypeError( ++ 'unexpected object {!r} in __signature__ ' ++ 'attribute'.format(sig)) + return sig + + try: +diff -r c0e311e010fc Lib/logging/__init__.py +--- a/Lib/logging/__init__.py ++++ b/Lib/logging/__init__.py +@@ -52,34 +52,6 @@ + #--------------------------------------------------------------------------- + + # +-# _srcfile is used when walking the stack to check when we've got the first +-# caller stack frame. +-# +-if hasattr(sys, 'frozen'): #support for py2exe +- _srcfile = "logging%s__init__%s" % (os.sep, __file__[-4:]) +-else: +- _srcfile = __file__ +-_srcfile = os.path.normcase(_srcfile) +- +- +-if hasattr(sys, '_getframe'): +- currentframe = lambda: sys._getframe(3) +-else: #pragma: no cover +- def currentframe(): +- """Return the frame object for the caller's stack frame.""" +- try: +- raise Exception +- except Exception: +- return sys.exc_info()[2].tb_frame.f_back +- +-# _srcfile is only used in conjunction with sys._getframe(). +-# To provide compatibility with older versions of Python, set _srcfile +-# to None if _getframe() is not available; this value will prevent +-# findCaller() from being called. 
+-#if not hasattr(sys, "_getframe"): +-# _srcfile = None +- +-# + #_startTime is used as the base when calculating the relative time of events + # + _startTime = time.time() +@@ -172,6 +144,40 @@ + finally: + _releaseLock() + ++if hasattr(sys, '_getframe'): ++ currentframe = lambda: sys._getframe(3) ++else: #pragma: no cover ++ def currentframe(): ++ """Return the frame object for the caller's stack frame.""" ++ try: ++ raise Exception ++ except Exception: ++ return sys.exc_info()[2].tb_frame.f_back ++ ++# ++# _srcfile is used when walking the stack to check when we've got the first ++# caller stack frame, by skipping frames whose filename is that of this ++# module's source. It therefore should contain the filename of this module's ++# source file. ++# ++# Ordinarily we would use __file__ for this, but frozen modules don't always ++# have __file__ set, for some reason (see Issue #21736). Thus, we get the ++# filename from a handy code object from a function defined in this module. ++# (There's no particular reason for picking addLevelName.) ++# ++ ++_srcfile = os.path.normcase(addLevelName.__code__.co_filename) ++ ++# _srcfile is only used in conjunction with sys._getframe(). ++# To provide compatibility with older versions of Python, set _srcfile ++# to None if _getframe() is not available; this value will prevent ++# findCaller() from being called. You can also do this if you want to avoid ++# the overhead of fetching caller information, even when _getframe() is ++# available. ++#if not hasattr(sys, '_getframe'): ++# _srcfile = None ++ ++ + def _checkLevel(level): + if isinstance(level, int): + rv = level +diff -r c0e311e010fc Lib/logging/handlers.py +--- a/Lib/logging/handlers.py ++++ b/Lib/logging/handlers.py +@@ -463,6 +463,7 @@ + # we have an open file handle, clean it up + self.stream.flush() + self.stream.close() ++ self.stream = None # See Issue #21742: _open () might fail. + # open a new file handle and get new stat info from that fd + self.stream = self._open() + self._statstream() +diff -r c0e311e010fc Lib/modulefinder.py +--- a/Lib/modulefinder.py ++++ b/Lib/modulefinder.py +@@ -568,11 +568,12 @@ + if isinstance(consts[i], type(co)): + consts[i] = self.replace_paths_in_code(consts[i]) + +- return types.CodeType(co.co_argcount, co.co_nlocals, co.co_stacksize, +- co.co_flags, co.co_code, tuple(consts), co.co_names, +- co.co_varnames, new_filename, co.co_name, +- co.co_firstlineno, co.co_lnotab, +- co.co_freevars, co.co_cellvars) ++ return types.CodeType(co.co_argcount, co.co_kwonlyargcount, ++ co.co_nlocals, co.co_stacksize, co.co_flags, ++ co.co_code, tuple(consts), co.co_names, ++ co.co_varnames, new_filename, co.co_name, ++ co.co_firstlineno, co.co_lnotab, co.co_freevars, ++ co.co_cellvars) + + + def test(): +diff -r c0e311e010fc Lib/multiprocessing/dummy/__init__.py +--- a/Lib/multiprocessing/dummy/__init__.py ++++ b/Lib/multiprocessing/dummy/__init__.py +@@ -104,7 +104,7 @@ + self._value = value + value = property(_get, _set) + def __repr__(self): +- return '<%r(%r, %r)>'%(type(self).__name__,self._typecode,self._value) ++ return '<%s(%r, %r)>'%(type(self).__name__,self._typecode,self._value) + + def Manager(): + return sys.modules[__name__] +diff -r c0e311e010fc Lib/nntplib.py +--- a/Lib/nntplib.py ++++ b/Lib/nntplib.py +@@ -86,7 +86,7 @@ + ] + + # maximal line length when calling readline(). This is to prevent +-# reading arbitrary lenght lines. RFC 3977 limits NNTP line length to ++# reading arbitrary length lines. 
RFC 3977 limits NNTP line length to + # 512 characters, including CRLF. We have selected 2048 just to be on + # the safe side. + _MAXLINE = 2048 +diff -r c0e311e010fc Lib/os.py +--- a/Lib/os.py ++++ b/Lib/os.py +@@ -1,4 +1,4 @@ +-r"""OS routines for Mac, NT, or Posix depending on what system we're on. ++r"""OS routines for NT or Posix depending on what system we're on. + + This exports: + - all functions from posix, nt or ce, e.g. unlink, stat, etc. +@@ -312,11 +312,12 @@ + + When topdown is true, the caller can modify the dirnames list in-place + (e.g., via del or slice assignment), and walk will only recurse into the +- subdirectories whose names remain in dirnames; this can be used to prune +- the search, or to impose a specific order of visiting. Modifying +- dirnames when topdown is false is ineffective, since the directories in +- dirnames have already been generated by the time dirnames itself is +- generated. ++ subdirectories whose names remain in dirnames; this can be used to prune the ++ search, or to impose a specific order of visiting. Modifying dirnames when ++ topdown is false is ineffective, since the directories in dirnames have ++ already been generated by the time dirnames itself is generated. No matter ++ the value of topdown, the list of subdirectories is retrieved before the ++ tuples for the directory and its subdirectories are generated. + + By default errors from the os.listdir() call are ignored. If + optional arg 'onerror' is specified, it should be a function; it +@@ -344,6 +345,7 @@ + print("bytes in", len(files), "non-directory files") + if 'CVS' in dirs: + dirs.remove('CVS') # don't visit CVS directories ++ + """ + + islink, join, isdir = path.islink, path.join, path.isdir +diff -r c0e311e010fc Lib/pathlib.py +--- a/Lib/pathlib.py ++++ b/Lib/pathlib.py +@@ -749,17 +749,20 @@ + """Return a new path with the file name changed.""" + if not self.name: + raise ValueError("%r has an empty name" % (self,)) ++ drv, root, parts = self._flavour.parse_parts((name,)) ++ if (not name or name[-1] in [self._flavour.sep, self._flavour.altsep] ++ or drv or root or len(parts) != 1): ++ raise ValueError("Invalid name %r" % (name)) + return self._from_parsed_parts(self._drv, self._root, + self._parts[:-1] + [name]) + + def with_suffix(self, suffix): + """Return a new path with the file suffix changed (or added, if none).""" + # XXX if suffix is None, should the current suffix be removed? 
+- drv, root, parts = self._flavour.parse_parts((suffix,)) +- if drv or root or len(parts) != 1: ++ f = self._flavour ++ if f.sep in suffix or f.altsep and f.altsep in suffix: + raise ValueError("Invalid suffix %r" % (suffix)) +- suffix = parts[0] +- if not suffix.startswith('.'): ++ if suffix and not suffix.startswith('.') or suffix == '.': + raise ValueError("Invalid suffix %r" % (suffix)) + name = self.name + if not name: +diff -r c0e311e010fc Lib/pdb.py +--- a/Lib/pdb.py ++++ b/Lib/pdb.py +@@ -673,7 +673,7 @@ + # now set the break point + err = self.set_break(filename, line, temporary, cond, funcname) + if err: +- self.error(err, file=self.stdout) ++ self.error(err) + else: + bp = self.get_breaks(filename, line)[-1] + self.message("Breakpoint %d at %s:%d" % +diff -r c0e311e010fc Lib/pkgutil.py +--- a/Lib/pkgutil.py ++++ b/Lib/pkgutil.py +@@ -456,6 +456,8 @@ + """ + if module_or_name in sys.modules: + module_or_name = sys.modules[module_or_name] ++ if module_or_name is None: ++ return None + if isinstance(module_or_name, ModuleType): + module = module_or_name + loader = getattr(module, '__loader__', None) +@@ -487,7 +489,7 @@ + # pkgutil previously raised ImportError + msg = "Error while finding loader for {!r} ({}: {})" + raise ImportError(msg.format(fullname, type(ex), ex)) from ex +- return spec.loader ++ return spec.loader if spec is not None else None + + + def extend_path(path, name): +diff -r c0e311e010fc Lib/plistlib.py +--- a/Lib/plistlib.py ++++ b/Lib/plistlib.py +@@ -619,10 +619,7 @@ + offset_table_offset + ) = struct.unpack('>6xBBQQQ', trailer) + self._fp.seek(offset_table_offset) +- offset_format = '>' + _BINARY_FORMAT[offset_size] * num_objects +- self._ref_format = _BINARY_FORMAT[self._ref_size] +- self._object_offsets = struct.unpack( +- offset_format, self._fp.read(offset_size * num_objects)) ++ self._object_offsets = self._read_ints(num_objects, offset_size) + return self._read_object(self._object_offsets[top_object]) + + except (OSError, IndexError, struct.error): +@@ -638,9 +635,16 @@ + + return tokenL + ++ def _read_ints(self, n, size): ++ data = self._fp.read(size * n) ++ if size in _BINARY_FORMAT: ++ return struct.unpack('>' + _BINARY_FORMAT[size] * n, data) ++ else: ++ return tuple(int.from_bytes(data[i: i + size], 'big') ++ for i in range(0, size * n, size)) ++ + def _read_refs(self, n): +- return struct.unpack( +- '>' + self._ref_format * n, self._fp.read(n * self._ref_size)) ++ return self._read_ints(n, self._ref_size) + + def _read_object(self, offset): + """ +@@ -980,18 +984,16 @@ + fp.seek(0) + for info in _FORMATS.values(): + if info['detect'](header): +- p = info['parser']( +- use_builtin_types=use_builtin_types, +- dict_type=dict_type, +- ) ++ P = info['parser'] + break + + else: + raise InvalidFileException() + + else: +- p = _FORMATS[fmt]['parser'](use_builtin_types=use_builtin_types) ++ P = _FORMATS[fmt]['parser'] + ++ p = P(use_builtin_types=use_builtin_types, dict_type=dict_type) + return p.parse(fp) + + +diff -r c0e311e010fc Lib/posixpath.py +--- a/Lib/posixpath.py ++++ b/Lib/posixpath.py +@@ -48,7 +48,6 @@ + + def normcase(s): + """Normalize case of pathname. Has no effect under Posix""" +- # TODO: on Mac OS X, this should really return s.lower(). 
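For context on the normcase() hunk above: on POSIX the function is a deliberate no-op for str and bytes and rejects any other argument type. A minimal sketch:

    import posixpath

    print(posixpath.normcase('MiXeD/Case'))   # unchanged on POSIX systems

    try:
        posixpath.normcase(42)                # non-str/bytes input raises
    except TypeError as exc:
        print(exc)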
+ if not isinstance(s, (bytes, str)): + raise TypeError("normcase() argument must be str or bytes, " + "not '{}'".format(s.__class__.__name__)) +diff -r c0e311e010fc Lib/pydoc.py +--- a/Lib/pydoc.py ++++ b/Lib/pydoc.py +@@ -64,6 +64,7 @@ + import sys + import time + import tokenize ++import urllib.parse + import warnings + from collections import deque + from reprlib import Repr +@@ -595,10 +596,15 @@ + elif pep: + url = 'http://www.python.org/dev/peps/pep-%04d/' % int(pep) + results.append('%s' % (url, escape(all))) ++ elif selfdot: ++ # Create a link for methods like 'self.method(...)' ++ # and use for attributes like 'self.attr' ++ if text[end:end+1] == '(': ++ results.append('self.' + self.namelink(name, methods)) ++ else: ++ results.append('self.%s' % name) + elif text[end:end+1] == '(': + results.append(self.namelink(name, methods, funcs, classes)) +- elif selfdot: +- results.append('self.%s' % name) + else: + results.append(self.namelink(name, classes)) + here = end +@@ -643,10 +649,7 @@ + head = '%s' % linkedname + try: + path = inspect.getabsfile(object) +- url = path +- if sys.platform == 'win32': +- import nturl2path +- url = nturl2path.pathname2url(path) ++ url = urllib.parse.quote(path) + filelink = self.filelink(url, path) + except TypeError: + filelink = '(built-in)' +@@ -1412,6 +1415,8 @@ + + def getpager(): + """Decide what method to use for paging through text.""" ++ if not hasattr(sys.stdin, "isatty"): ++ return plainpager + if not hasattr(sys.stdout, "isatty"): + return plainpager + if not sys.stdin.isatty() or not sys.stdout.isatty(): +@@ -1733,7 +1738,6 @@ + 'TRACEBACKS': 'TYPES', + 'NONE': ('bltin-null-object', ''), + 'ELLIPSIS': ('bltin-ellipsis-object', 'SLICINGS'), +- 'FILES': ('bltin-file-objects', ''), + 'SPECIALATTRIBUTES': ('specialattrs', ''), + 'CLASSES': ('types', 'class SPECIALMETHODS PRIVATENAMES'), + 'MODULES': ('typesmodules', 'import'), +@@ -2347,7 +2351,7 @@ + + def html_getfile(path): + """Get and display a source file listing safely.""" +- path = path.replace('%20', ' ') ++ path = urllib.parse.unquote(path) + with tokenize.open(path) as fp: + lines = html.escape(fp.read()) + body = '
<pre>%s</pre>
' % lines +diff -r c0e311e010fc Lib/quopri.py +--- a/Lib/quopri.py ++++ b/Lib/quopri.py +@@ -44,13 +44,11 @@ + def encode(input, output, quotetabs, header=False): + """Read 'input', apply quoted-printable encoding, and write to 'output'. + +- 'input' and 'output' are files with readline() and write() methods. +- The 'quotetabs' flag indicates whether embedded tabs and spaces should be +- quoted. Note that line-ending tabs and spaces are always encoded, as per +- RFC 1521. +- The 'header' flag indicates whether we are encoding spaces as _ as per +- RFC 1522. +- """ ++ 'input' and 'output' are binary file objects. The 'quotetabs' flag ++ indicates whether embedded tabs and spaces should be quoted. Note that ++ line-ending tabs and spaces are always encoded, as per RFC 1521. ++ The 'header' flag indicates whether we are encoding spaces as _ as per RFC ++ 1522.""" + + if b2a_qp is not None: + data = input.read() +@@ -118,7 +116,7 @@ + + def decode(input, output, header=False): + """Read 'input', apply quoted-printable decoding, and write to 'output'. +- 'input' and 'output' are files with readline() and write() methods. ++ 'input' and 'output' are binary file objects. + If 'header' is true, decode underscore as space (per RFC 1522).""" + + if a2b_qp is not None: +diff -r c0e311e010fc Lib/random.py +--- a/Lib/random.py ++++ b/Lib/random.py +@@ -355,7 +355,10 @@ + + """ + u = self.random() +- c = 0.5 if mode is None else (mode - low) / (high - low) ++ try: ++ c = 0.5 if mode is None else (mode - low) / (high - low) ++ except ZeroDivisionError: ++ return low + if u > c: + u = 1.0 - u + c = 1.0 - c +diff -r c0e311e010fc Lib/site.py +--- a/Lib/site.py ++++ b/Lib/site.py +@@ -373,7 +373,7 @@ + dirs.extend([os.path.join(here, os.pardir), here, os.curdir]) + builtins.license = _sitebuiltins._Printer( + "license", +- "See http://www.python.org/download/releases/%.5s/license" % sys.version, ++ "See http://www.python.org/psf/license/", + files, dirs) + + +diff -r c0e311e010fc Lib/smtplib.py +--- a/Lib/smtplib.py ++++ b/Lib/smtplib.py +@@ -377,6 +377,7 @@ + if self.debuglevel > 0: + print('reply:', repr(line), file=stderr) + if len(line) > _MAXLINE: ++ self.close() + raise SMTPResponseException(500, "Line too long.") + resp.append(line[4:].strip(b' \t\r\n')) + code = line[:3] +diff -r c0e311e010fc Lib/socketserver.py +--- a/Lib/socketserver.py ++++ b/Lib/socketserver.py +@@ -523,35 +523,39 @@ + + def collect_children(self): + """Internal routine to wait for children that have exited.""" +- if self.active_children is None: return ++ if self.active_children is None: ++ return ++ ++ # If we're above the max number of children, wait and reap them until ++ # we go back below threshold. Note that we use waitpid(-1) below to be ++ # able to collect children in size() syscalls instead ++ # of size(): the downside is that this might reap children ++ # which we didn't spawn, which is why we only resort to this when we're ++ # above max_children. + while len(self.active_children) >= self.max_children: +- # XXX: This will wait for any child process, not just ones +- # spawned by this library. This could confuse other +- # libraries that expect to be able to wait for their own +- # children. 
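The non-blocking reaping loop added further down in this hunk boils down to the following pattern; this is a standalone sketch (POSIX-only, names are illustrative), not the module's actual code:

    import os

    def reap(active_children):
        """Poll each known child with WNOHANG; a result of 0 means it is still running."""
        for pid in list(active_children):
            try:
                done, _ = os.waitpid(pid, os.WNOHANG)
            except ChildProcessError:
                done = pid                    # already reaped by someone else
            if done:
                active_children.discard(pid)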
+ try: +- pid, status = os.waitpid(0, 0) ++ pid, _ = os.waitpid(-1, 0) ++ self.active_children.discard(pid) ++ except InterruptedError: ++ pass ++ except ChildProcessError: ++ # we don't have any children, we're done ++ self.active_children.clear() + except OSError: +- pid = None +- if pid not in self.active_children: continue +- self.active_children.remove(pid) ++ break + +- # XXX: This loop runs more system calls than it ought +- # to. There should be a way to put the active_children into a +- # process group and then use os.waitpid(-pgid) to wait for any +- # of that set, but I couldn't find a way to allocate pgids +- # that couldn't collide. +- for child in self.active_children: ++ # Now reap all defunct children. ++ for pid in self.active_children.copy(): + try: +- pid, status = os.waitpid(child, os.WNOHANG) ++ pid, _ = os.waitpid(pid, os.WNOHANG) ++ # if the child hasn't exited yet, pid will be 0 and ignored by ++ # discard() below ++ self.active_children.discard(pid) ++ except ChildProcessError: ++ # someone else reaped it ++ self.active_children.discard(pid) + except OSError: +- pid = None +- if not pid: continue +- try: +- self.active_children.remove(pid) +- except ValueError as e: +- raise ValueError('%s. x=%d and list=%r' % (e.message, pid, +- self.active_children)) ++ pass + + def handle_timeout(self): + """Wait for zombies after self.timeout seconds of inactivity. +@@ -573,8 +577,8 @@ + if pid: + # Parent process + if self.active_children is None: +- self.active_children = [] +- self.active_children.append(pid) ++ self.active_children = set() ++ self.active_children.add(pid) + self.close_request(request) + return + else: +diff -r c0e311e010fc Lib/sqlite3/dbapi2.py +--- a/Lib/sqlite3/dbapi2.py ++++ b/Lib/sqlite3/dbapi2.py +@@ -22,6 +22,7 @@ + + import datetime + import time ++import collections.abc + + from _sqlite3 import * + +@@ -50,6 +51,7 @@ + sqlite_version_info = tuple([int(x) for x in sqlite_version.split(".")]) + + Binary = memoryview ++collections.abc.Sequence.register(Row) + + def register_adapters_and_converters(): + def adapt_date(val): +diff -r c0e311e010fc Lib/sqlite3/test/factory.py +--- a/Lib/sqlite3/test/factory.py ++++ b/Lib/sqlite3/test/factory.py +@@ -23,6 +23,7 @@ + + import unittest + import sqlite3 as sqlite ++from collections.abc import Sequence + + class MyConnection(sqlite.Connection): + def __init__(self, *args, **kwargs): +@@ -96,9 +97,19 @@ + self.assertEqual(col1, 1, "by name: wrong result for column 'A'") + self.assertEqual(col2, 2, "by name: wrong result for column 'B'") + +- col1, col2 = row[0], row[1] +- self.assertEqual(col1, 1, "by index: wrong result for column 0") +- self.assertEqual(col2, 2, "by index: wrong result for column 1") ++ self.assertEqual(row[0], 1, "by index: wrong result for column 0") ++ self.assertEqual(row[1], 2, "by index: wrong result for column 1") ++ self.assertEqual(row[-1], 2, "by index: wrong result for column -1") ++ self.assertEqual(row[-2], 1, "by index: wrong result for column -2") ++ ++ with self.assertRaises(IndexError): ++ row['c'] ++ with self.assertRaises(IndexError): ++ row[2] ++ with self.assertRaises(IndexError): ++ row[-3] ++ with self.assertRaises(IndexError): ++ row[2**1000] + + def CheckSqliteRowIter(self): + """Checks if the row object is iterable""" +@@ -142,6 +153,15 @@ + self.assertNotEqual(row_1, row_3) + self.assertNotEqual(hash(row_1), hash(row_3)) + ++ def CheckSqliteRowAsSequence(self): ++ """ Checks if the row object can act like a sequence """ ++ self.con.row_factory = sqlite.Row ++ row = 
self.con.execute("select 1 as a, 2 as b").fetchone() ++ ++ as_tuple = tuple(row) ++ self.assertEqual(list(reversed(row)), list(reversed(as_tuple))) ++ self.assertIsInstance(row, Sequence) ++ + def tearDown(self): + self.con.close() + +diff -r c0e311e010fc Lib/sre_parse.py +--- a/Lib/sre_parse.py ++++ b/Lib/sre_parse.py +@@ -616,7 +616,8 @@ + "%r" % name) + gid = state.groupdict.get(name) + if gid is None: +- raise error("unknown group name") ++ msg = "unknown group name: {0!r}".format(name) ++ raise error(msg) + subpatternappend((GROUPREF, gid)) + continue + else: +@@ -669,7 +670,8 @@ + if condname.isidentifier(): + condgroup = state.groupdict.get(condname) + if condgroup is None: +- raise error("unknown group name") ++ msg = "unknown group name: {0!r}".format(condname) ++ raise error(msg) + else: + try: + condgroup = int(condname) +@@ -806,7 +808,8 @@ + try: + index = pattern.groupindex[name] + except KeyError: +- raise IndexError("unknown group name") ++ msg = "unknown group name: {0!r}".format(name) ++ raise IndexError(msg) + addgroup(index) + elif c == "0": + if s.next in OCTDIGITS: +diff -r c0e311e010fc Lib/tarfile.py +--- a/Lib/tarfile.py ++++ b/Lib/tarfile.py +@@ -1423,7 +1423,8 @@ + fileobj = bltn_open(name, self._mode) + self._extfileobj = False + else: +- if name is None and hasattr(fileobj, "name"): ++ if (name is None and hasattr(fileobj, "name") and ++ isinstance(fileobj.name, (str, bytes))): + name = fileobj.name + if hasattr(fileobj, "mode"): + self._mode = fileobj.mode +diff -r c0e311e010fc Lib/test/datetimetester.py +--- a/Lib/test/datetimetester.py ++++ b/Lib/test/datetimetester.py +@@ -5,6 +5,7 @@ + + import sys + import pickle ++import random + import unittest + + from operator import lt, le, gt, ge, eq, ne, truediv, floordiv, mod +@@ -76,8 +77,18 @@ + def __init__(self, offset=None, name=None, dstoffset=None): + FixedOffset.__init__(self, offset, name, dstoffset) + ++class _TZInfo(tzinfo): ++ def utcoffset(self, datetime_module): ++ return random.random() ++ + class TestTZInfo(unittest.TestCase): + ++ def test_refcnt_crash_bug_22044(self): ++ tz1 = _TZInfo() ++ dt1 = datetime(2014, 7, 21, 11, 32, 3, 0, tz1) ++ with self.assertRaises(TypeError): ++ dt1.utcoffset() ++ + def test_non_abstractness(self): + # In order to allow subclasses to get pickled, the C implementation + # wasn't able to get away with having __init__ raise +diff -r c0e311e010fc Lib/test/multibytecodec_support.py +--- a/Lib/test/multibytecodec_support.py ++++ b/Lib/test/multibytecodec_support.py +@@ -277,8 +277,7 @@ + supmaps = [] + codectests = [] + +- def __init__(self, *args, **kw): +- unittest.TestCase.__init__(self, *args, **kw) ++ def setUp(self): + try: + self.open_mapping_file().close() # test it to report the error early + except (OSError, HTTPException): +diff -r c0e311e010fc Lib/test/pydoc_mod.py +--- a/Lib/test/pydoc_mod.py ++++ b/Lib/test/pydoc_mod.py +@@ -15,6 +15,16 @@ + NO_MEANING = "eggs" + pass + ++class C(object): ++ def say_no(self): ++ return "no" ++ def get_answer(self): ++ """ Return say_no() """ ++ return self.say_no() ++ def is_it_true(self): ++ """ Return self.get_answer() """ ++ return self.get_answer() ++ + def doc_func(): + """ + This function solves all of the world's problems: +diff -r c0e311e010fc Lib/test/pystone.py +--- a/Lib/test/pystone.py ++++ b/Lib/test/pystone.py +@@ -3,7 +3,7 @@ + """ + "PYSTONE" Benchmark Program + +-Version: Python/1.1 (corresponds to C/1.1 plus 2 Pystone fixes) ++Version: Python/1.2 (corresponds to C/1.1 plus 3 Pystone fixes) + + Author: 
Reinhold P. Weicker, CACM Vol 27, No 10, 10/84 pg. 1013. + +@@ -30,13 +30,20 @@ + percent faster than version 1.0, so benchmark figures + of different versions can't be compared directly. + ++ Version 1.2 changes the division to floor division. ++ ++ Under Python 3 version 1.1 would use the normal division ++ operator, resulting in some of the operations mistakenly ++ yielding floats. Version 1.2 instead uses floor division ++ making the benchmark a integer benchmark again. ++ + """ + + LOOPS = 50000 + + from time import clock + +-__version__ = "1.1" ++__version__ = "1.2" + + [Ident1, Ident2, Ident3, Ident4, Ident5] = range(1, 6) + +@@ -123,7 +130,7 @@ + EnumLoc = Proc6(Ident1) + CharIndex = chr(ord(CharIndex)+1) + IntLoc3 = IntLoc2 * IntLoc1 +- IntLoc2 = IntLoc3 / IntLoc1 ++ IntLoc2 = IntLoc3 // IntLoc1 + IntLoc2 = 7 * (IntLoc3 - IntLoc2) - IntLoc1 + IntLoc1 = Proc2(IntLoc1) + +diff -r c0e311e010fc Lib/test/script_helper.py +--- a/Lib/test/script_helper.py ++++ b/Lib/test/script_helper.py +@@ -155,8 +155,8 @@ + script_name = make_script(zip_dir, script_basename, source) + unlink.append(script_name) + if compiled: +- init_name = py_compile(init_name, doraise=True) +- script_name = py_compile(script_name, doraise=True) ++ init_name = py_compile.compile(init_name, doraise=True) ++ script_name = py_compile.compile(script_name, doraise=True) + unlink.extend((init_name, script_name)) + pkg_names = [os.sep.join([pkg_name]*i) for i in range(1, depth+1)] + script_name_in_zip = os.path.join(pkg_names[-1], os.path.basename(script_name)) +diff -r c0e311e010fc Lib/test/support/__init__.py +--- a/Lib/test/support/__init__.py ++++ b/Lib/test/support/__init__.py +@@ -3,28 +3,29 @@ + if __name__ != 'test.support': + raise ImportError('support must be imported from the test package') + ++import collections.abc + import contextlib + import errno ++import fnmatch + import functools + import gc +-import socket +-import sys ++import importlib ++import importlib.util ++import logging.handlers + import os + import platform ++import re + import shutil ++import socket ++import stat ++import struct ++import subprocess ++import sys ++import sysconfig ++import tempfile ++import time ++import unittest + import warnings +-import unittest +-import importlib +-import importlib.util +-import collections.abc +-import re +-import subprocess +-import time +-import sysconfig +-import fnmatch +-import logging.handlers +-import struct +-import tempfile + + try: + import _thread, threading +@@ -84,7 +85,7 @@ + "skip_unless_symlink", "requires_gzip", "requires_bz2", "requires_lzma", + "bigmemtest", "bigaddrspacetest", "cpython_only", "get_attribute", + "requires_IEEE_754", "skip_unless_xattr", "requires_zlib", +- "anticipate_failure", ++ "anticipate_failure", "load_package_tests", + # sys + "is_jython", "check_impl_detail", + # network +@@ -187,6 +188,25 @@ + return unittest.expectedFailure + return lambda f: f + ++def load_package_tests(pkg_dir, loader, standard_tests, pattern): ++ """Generic load_tests implementation for simple test packages. 
++ ++ Most packages can implement load_tests using this function as follows: ++ ++ def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) ++ """ ++ if pattern is None: ++ pattern = "test*" ++ top_dir = os.path.dirname( # Lib ++ os.path.dirname( # test ++ os.path.dirname(__file__))) # support ++ package_tests = loader.discover(start_dir=pkg_dir, ++ top_level_dir=top_dir, ++ pattern=pattern) ++ standard_tests.addTests(package_tests) ++ return standard_tests ++ + + def import_fresh_module(name, fresh=(), blocked=(), deprecated=False): + """Import and return a module, deliberately bypassing sys.modules. +@@ -316,7 +336,13 @@ + def _rmtree_inner(path): + for name in os.listdir(path): + fullname = os.path.join(path, name) +- if os.path.isdir(fullname): ++ try: ++ mode = os.lstat(fullname).st_mode ++ except OSError as exc: ++ print("support.rmtree(): os.lstat(%r) failed with %s" % (fullname, exc), ++ file=sys.__stderr__) ++ mode = 0 ++ if stat.S_ISDIR(mode): + _waitfor(_rmtree_inner, fullname, waitall=True) + os.rmdir(fullname) + else: +@@ -454,23 +480,17 @@ + return _is_gui_available.result + + def is_resource_enabled(resource): +- """Test whether a resource is enabled. Known resources are set by +- regrtest.py.""" +- return use_resources is not None and resource in use_resources ++ """Test whether a resource is enabled. ++ ++ Known resources are set by regrtest.py. If not running under regrtest.py, ++ all resources are assumed enabled unless use_resources has been set. ++ """ ++ return use_resources is None or resource in use_resources + + def requires(resource, msg=None): +- """Raise ResourceDenied if the specified resource is not available. +- +- If the caller's module is __main__ then automatically return True. The +- possibility of False being returned occurs when regrtest.py is +- executing. +- """ ++ """Raise ResourceDenied if the specified resource is not available.""" + if resource == 'gui' and not _is_gui_available(): + raise ResourceDenied(_is_gui_available.reason) +- # see if the caller's module is __main__ - if so, treat as if +- # the resource was set +- if sys._getframe(1).f_globals.get("__name__") == "__main__": +- return + if not is_resource_enabled(resource): + if msg is None: + msg = "Use of the %r resource not enabled" % resource +diff -r c0e311e010fc Lib/test/test__osx_support.py +--- a/Lib/test/test__osx_support.py ++++ b/Lib/test/test__osx_support.py +@@ -109,7 +109,9 @@ + + def test__supports_universal_builds(self): + import platform +- self.assertEqual(platform.mac_ver()[0].split('.') >= ['10', '4'], ++ mac_ver_tuple = tuple(int(i) for i in ++ platform.mac_ver()[0].split('.')[0:2]) ++ self.assertEqual(mac_ver_tuple >= (10, 4), + _osx_support._supports_universal_builds()) + + def test__find_appropriate_compiler(self): +diff -r c0e311e010fc Lib/test/test_argparse.py +--- a/Lib/test/test_argparse.py ++++ b/Lib/test/test_argparse.py +@@ -4551,6 +4551,12 @@ + self.assertTrue(ns2 != ns3) + self.assertTrue(ns2 != ns4) + ++ def test_equality_returns_notimplemeted(self): ++ # See issue 21481 ++ ns = argparse.Namespace(a=1, b=2) ++ self.assertIs(ns.__eq__(None), NotImplemented) ++ self.assertIs(ns.__ne__(None), NotImplemented) ++ + + # =================== + # File encoding tests +diff -r c0e311e010fc Lib/test/test_asynchat.py +--- a/Lib/test/test_asynchat.py ++++ b/Lib/test/test_asynchat.py +@@ -5,9 +5,14 @@ + # If this fails, the test will be skipped. 
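A short aside on the argparse.Namespace behaviour tested a little further up: returning NotImplemented from __eq__/__ne__ lets comparisons with unrelated types fall back to Python's defaults instead of raising. Sketch:

    import argparse

    ns = argparse.Namespace(a=1, b=2)
    print(ns == argparse.Namespace(a=1, b=2))   # True
    print(ns == None)                           # False (identity fallback)
    print(ns.__eq__(None))                      # NotImplemented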
+ thread = support.import_module('_thread') + +-import asyncore, asynchat, socket, time ++import asynchat ++import asyncore ++import errno ++import socket ++import sys ++import time + import unittest +-import sys ++import unittest.mock + try: + import threading + except ImportError: +@@ -28,8 +33,8 @@ + self.event = event + self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) + self.port = support.bind_port(self.sock) +- # This will be set if the client wants us to wait before echoing data +- # back. ++ # This will be set if the client wants us to wait before echoing ++ # data back. + self.start_resend_event = None + + def run(self): +@@ -52,8 +57,8 @@ + + # re-send entire set of collected data + try: +- # this may fail on some tests, such as test_close_when_done, since +- # the client closes the channel when it's done sending ++ # this may fail on some tests, such as test_close_when_done, ++ # since the client closes the channel when it's done sending + while self.buffer: + n = conn.send(self.buffer[:self.chunk_size]) + time.sleep(0.001) +@@ -96,7 +101,7 @@ + s.start() + event.wait() + event.clear() +- time.sleep(0.01) # Give server time to start accepting. ++ time.sleep(0.01) # Give server time to start accepting. + return s, event + + +@@ -104,10 +109,10 @@ + class TestAsynchat(unittest.TestCase): + usepoll = False + +- def setUp (self): ++ def setUp(self): + self._threads = support.threading_setup() + +- def tearDown (self): ++ def tearDown(self): + support.threading_cleanup(*self._threads) + + def line_terminator_check(self, term, server_chunk): +@@ -117,7 +122,7 @@ + s.start() + event.wait() + event.clear() +- time.sleep(0.01) # Give server time to start accepting. ++ time.sleep(0.01) # Give server time to start accepting. + c = echo_client(term, s.port) + c.push(b"hello ") + c.push(b"world" + term) +@@ -136,17 +141,17 @@ + + def test_line_terminator1(self): + # test one-character terminator +- for l in (1,2,3): ++ for l in (1, 2, 3): + self.line_terminator_check(b'\n', l) + + def test_line_terminator2(self): + # test two-character terminator +- for l in (1,2,3): ++ for l in (1, 2, 3): + self.line_terminator_check(b'\r\n', l) + + def test_line_terminator3(self): + # test three-character terminator +- for l in (1,2,3): ++ for l in (1, 2, 3): + self.line_terminator_check(b'qqq', l) + + def numeric_terminator_check(self, termlen): +@@ -249,15 +254,48 @@ + # (which could still result in the client not having received anything) + self.assertGreater(len(s.buffer), 0) + ++ def test_push(self): ++ # Issue #12523: push() should raise a TypeError if it doesn't get ++ # a bytes string ++ s, event = start_echo_server() ++ c = echo_client(b'\n', s.port) ++ data = b'bytes\n' ++ c.push(data) ++ c.push(bytearray(data)) ++ c.push(memoryview(data)) ++ self.assertRaises(TypeError, c.push, 10) ++ self.assertRaises(TypeError, c.push, 'unicode') ++ c.push(SERVER_QUIT) ++ asyncore.loop(use_poll=self.usepoll, count=300, timeout=.01) ++ s.join(timeout=TIMEOUT) ++ self.assertEqual(c.contents, [b'bytes', b'bytes', b'bytes']) ++ + + class TestAsynchat_WithPoll(TestAsynchat): + usepoll = True + ++ ++class TestAsynchatMocked(unittest.TestCase): ++ def test_blockingioerror(self): ++ # Issue #16133: handle_read() must ignore BlockingIOError ++ sock = unittest.mock.Mock() ++ sock.recv.side_effect = BlockingIOError(errno.EAGAIN) ++ ++ dispatcher = asynchat.async_chat() ++ dispatcher.set_socket(sock) ++ self.addCleanup(dispatcher.del_channel) ++ ++ with unittest.mock.patch.object(dispatcher, 'handle_error') 
as error: ++ dispatcher.handle_read() ++ self.assertFalse(error.called) ++ ++ + class TestHelperFunctions(unittest.TestCase): + def test_find_prefix_at_end(self): + self.assertEqual(asynchat.find_prefix_at_end("qwerty\r", "\r\n"), 1) + self.assertEqual(asynchat.find_prefix_at_end("qwertydkjf", "\r\n"), 0) + ++ + class TestFifo(unittest.TestCase): + def test_basic(self): + f = asynchat.fifo() +@@ -283,5 +321,13 @@ + self.assertEqual(f.pop(), (0, None)) + + ++class TestNotConnected(unittest.TestCase): ++ def test_disallow_negative_terminator(self): ++ # Issue #11259 ++ client = asynchat.async_chat() ++ self.assertRaises(ValueError, client.set_terminator, -1) ++ ++ ++ + if __name__ == "__main__": + unittest.main() +diff -r c0e311e010fc Lib/test/test_asyncio/__init__.py +--- a/Lib/test/test_asyncio/__init__.py ++++ b/Lib/test/test_asyncio/__init__.py +@@ -1,29 +1,10 @@ + import os +-import sys +-import unittest +-from test.support import run_unittest, import_module ++from test.support import load_package_tests, import_module + + # Skip tests if we don't have threading. + import_module('threading') + # Skip tests if we don't have concurrent.futures. + import_module('concurrent.futures') + +- +-def suite(): +- tests = unittest.TestSuite() +- loader = unittest.TestLoader() +- for fn in os.listdir(os.path.dirname(__file__)): +- if fn.startswith("test") and fn.endswith(".py"): +- mod_name = 'test.test_asyncio.' + fn[:-3] +- try: +- __import__(mod_name) +- except unittest.SkipTest: +- pass +- else: +- mod = sys.modules[mod_name] +- tests.addTests(loader.loadTestsFromModule(mod)) +- return tests +- +- +-def test_main(): +- run_unittest(suite()) ++def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) +diff -r c0e311e010fc Lib/test/test_asyncio/__main__.py +--- a/Lib/test/test_asyncio/__main__.py ++++ b/Lib/test/test_asyncio/__main__.py +@@ -1,5 +1,4 @@ +-from . import test_main ++from . 
import load_tests ++import unittest + +- +-if __name__ == '__main__': +- test_main() ++unittest.main() +diff -r c0e311e010fc Lib/test/test_asyncio/test_base_events.py +--- a/Lib/test/test_asyncio/test_base_events.py ++++ b/Lib/test/test_asyncio/test_base_events.py +@@ -7,6 +7,7 @@ + import time + import unittest + from unittest import mock ++from test.script_helper import assert_python_ok + from test.support import IPV6_ENABLED + + import asyncio +@@ -19,12 +20,13 @@ + PY34 = sys.version_info >= (3, 4) + + +-class BaseEventLoopTests(unittest.TestCase): ++class BaseEventLoopTests(test_utils.TestCase): + + def setUp(self): + self.loop = base_events.BaseEventLoop() + self.loop._selector = mock.Mock() +- asyncio.set_event_loop(None) ++ self.loop._selector.select.return_value = () ++ self.set_event_loop(self.loop) + + def test_not_implemented(self): + m = mock.Mock() +@@ -42,8 +44,6 @@ + self.assertRaises( + NotImplementedError, self.loop._write_to_self) + self.assertRaises( +- NotImplementedError, self.loop._read_from_self) +- self.assertRaises( + NotImplementedError, + self.loop._make_read_pipe_transport, m, m) + self.assertRaises( +@@ -52,6 +52,20 @@ + gen = self.loop._make_subprocess_transport(m, m, m, m, m, m, m) + self.assertRaises(NotImplementedError, next, iter(gen)) + ++ def test_close(self): ++ self.assertFalse(self.loop.is_closed()) ++ self.loop.close() ++ self.assertTrue(self.loop.is_closed()) ++ ++ # it should be possible to call close() more than once ++ self.loop.close() ++ self.loop.close() ++ ++ # operation blocked when the loop is closed ++ f = asyncio.Future(loop=self.loop) ++ self.assertRaises(RuntimeError, self.loop.run_forever) ++ self.assertRaises(RuntimeError, self.loop.run_until_complete, f) ++ + def test__add_callback_handle(self): + h = asyncio.Handle(lambda: False, (), self.loop) + +@@ -141,7 +155,7 @@ + pass + + other_loop = base_events.BaseEventLoop() +- other_loop._selector = unittest.mock.Mock() ++ other_loop._selector = mock.Mock() + asyncio.set_event_loop(other_loop) + + # raise RuntimeError if the event loop is different in debug mode +@@ -226,30 +240,27 @@ + self.loop.set_debug(False) + self.assertFalse(self.loop.get_debug()) + +- @mock.patch('asyncio.base_events.time') + @mock.patch('asyncio.base_events.logger') +- def test__run_once_logging(self, m_logger, m_time): ++ def test__run_once_logging(self, m_logger): ++ def slow_select(timeout): ++ # Sleep a bit longer than a second to avoid timer resolution issues. ++ time.sleep(1.1) ++ return [] ++ ++ # logging needs debug flag ++ self.loop.set_debug(True) ++ + # Log to INFO level if timeout > 1.0 sec. 
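The debug-mode timing diagnostics these tests exercise can be reproduced with a few lines; a sketch assuming the Python 3.4 asyncio API (the report goes to the 'asyncio' logger, so logging must be configured to see it):

    import asyncio
    import logging
    import time

    logging.basicConfig()                  # make the asyncio logger's warnings visible

    loop = asyncio.new_event_loop()
    loop.set_debug(True)                   # or run with PYTHONASYNCIODEBUG=1
    loop.slow_callback_duration = 0.05     # warn about callbacks slower than 50 ms

    loop.call_soon(time.sleep, 0.1)        # deliberately slow callback
    loop.call_soon(loop.stop)
    loop.run_forever()
    loop.close()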
+- idx = -1 +- data = [10.0, 10.0, 12.0, 13.0] +- +- def monotonic(): +- nonlocal data, idx +- idx += 1 +- return data[idx] +- +- m_time.monotonic = monotonic +- +- self.loop._scheduled.append( +- asyncio.TimerHandle(11.0, lambda: True, (), self.loop)) ++ self.loop._selector.select = slow_select + self.loop._process_events = mock.Mock() + self.loop._run_once() + self.assertEqual(logging.INFO, m_logger.log.call_args[0][0]) + +- idx = -1 +- data = [10.0, 10.0, 10.3, 13.0] +- self.loop._scheduled = [asyncio.TimerHandle(11.0, lambda: True, (), +- self.loop)] ++ def fast_select(timeout): ++ time.sleep(0.001) ++ return [] ++ ++ self.loop._selector.select = fast_select + self.loop._run_once() + self.assertEqual(logging.DEBUG, m_logger.log.call_args[0][0]) + +@@ -276,6 +287,12 @@ + self.assertRaises(TypeError, + self.loop.run_until_complete, 'blah') + ++ def test_run_until_complete_loop(self): ++ task = asyncio.Future(loop=self.loop) ++ other_loop = self.new_test_loop() ++ self.assertRaises(ValueError, ++ other_loop.run_until_complete, task) ++ + def test_subprocess_exec_invalid_args(self): + args = [sys.executable, '-c', 'pass'] + +@@ -388,19 +405,22 @@ + 1/0 + + def run_loop(): +- self.loop.call_soon(zero_error) ++ handle = self.loop.call_soon(zero_error) + self.loop._run_once() ++ return handle + ++ self.loop.set_debug(True) + self.loop._process_events = mock.Mock() + + mock_handler = mock.Mock() + self.loop.set_exception_handler(mock_handler) +- run_loop() ++ handle = run_loop() + mock_handler.assert_called_with(self.loop, { + 'exception': MOCK_ANY, + 'message': test_utils.MockPattern( + 'Exception in callback.*zero_error'), +- 'handle': MOCK_ANY, ++ 'handle': handle, ++ 'source_traceback': handle._source_traceback, + }) + mock_handler.reset_mock() + +@@ -482,6 +502,52 @@ + self.assertIs(type(_context['context']['exception']), + ZeroDivisionError) + ++ def test_env_var_debug(self): ++ code = '\n'.join(( ++ 'import asyncio', ++ 'loop = asyncio.get_event_loop()', ++ 'print(loop.get_debug())')) ++ ++ # Test with -E to not fail if the unit test was run with ++ # PYTHONASYNCIODEBUG set to a non-empty string ++ sts, stdout, stderr = assert_python_ok('-E', '-c', code) ++ self.assertEqual(stdout.rstrip(), b'False') ++ ++ sts, stdout, stderr = assert_python_ok('-c', code, ++ PYTHONASYNCIODEBUG='') ++ self.assertEqual(stdout.rstrip(), b'False') ++ ++ sts, stdout, stderr = assert_python_ok('-c', code, ++ PYTHONASYNCIODEBUG='1') ++ self.assertEqual(stdout.rstrip(), b'True') ++ ++ sts, stdout, stderr = assert_python_ok('-E', '-c', code, ++ PYTHONASYNCIODEBUG='1') ++ self.assertEqual(stdout.rstrip(), b'False') ++ ++ def test_create_task(self): ++ class MyTask(asyncio.Task): ++ pass ++ ++ @asyncio.coroutine ++ def test(): ++ pass ++ ++ class EventLoop(base_events.BaseEventLoop): ++ def create_task(self, coro): ++ return MyTask(coro, loop=loop) ++ ++ loop = EventLoop() ++ self.set_event_loop(loop) ++ ++ coro = test() ++ task = asyncio.async(coro, loop=loop) ++ self.assertIsInstance(task, MyTask) ++ ++ # make warnings quiet ++ task._log_destroy_pending = False ++ coro.close() ++ + + class MyProto(asyncio.Protocol): + done = None +@@ -541,14 +607,11 @@ + self.done.set_result(None) + + +-class BaseEventLoopWithSelectorTests(unittest.TestCase): ++class BaseEventLoopWithSelectorTests(test_utils.TestCase): + + def setUp(self): + self.loop = asyncio.new_event_loop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.set_event_loop(self.loop) + + 
@mock.patch('asyncio.base_events.socket') + def test_create_connection_multiple_errors(self, m_socket): +@@ -583,6 +646,27 @@ + + self.assertEqual(str(cm.exception), 'Multiple exceptions: err1, err2') + ++ @mock.patch('asyncio.base_events.socket') ++ def test_create_connection_timeout(self, m_socket): ++ # Ensure that the socket is closed on timeout ++ sock = mock.Mock() ++ m_socket.socket.return_value = sock ++ ++ def getaddrinfo(*args, **kw): ++ fut = asyncio.Future(loop=self.loop) ++ addr = (socket.AF_INET, socket.SOCK_STREAM, 0, '', ++ ('127.0.0.1', 80)) ++ fut.set_result([addr]) ++ return fut ++ self.loop.getaddrinfo = getaddrinfo ++ ++ with mock.patch.object(self.loop, 'sock_connect', ++ side_effect=asyncio.TimeoutError): ++ coro = self.loop.create_connection(MyProto, '127.0.0.1', 80) ++ with self.assertRaises(asyncio.TimeoutError): ++ self.loop.run_until_complete(coro) ++ self.assertTrue(sock.close.called) ++ + def test_create_connection_host_port_sock(self): + coro = self.loop.create_connection( + MyProto, 'example.com', 80, sock=object()) +@@ -944,6 +1028,34 @@ + with self.assertRaises(TypeError): + self.loop.run_in_executor(None, coroutine_function) + ++ @mock.patch('asyncio.base_events.logger') ++ def test_log_slow_callbacks(self, m_logger): ++ def stop_loop_cb(loop): ++ loop.stop() ++ ++ @asyncio.coroutine ++ def stop_loop_coro(loop): ++ yield from () ++ loop.stop() ++ ++ asyncio.set_event_loop(self.loop) ++ self.loop.set_debug(True) ++ self.loop.slow_callback_duration = 0.0 ++ ++ # slow callback ++ self.loop.call_soon(stop_loop_cb, self.loop) ++ self.loop.run_forever() ++ fmt, *args = m_logger.warning.call_args[0] ++ self.assertRegex(fmt % tuple(args), ++ "^Executing took .* seconds$") ++ ++ # slow task ++ asyncio.async(stop_loop_coro(self.loop), loop=self.loop) ++ self.loop.run_forever() ++ fmt, *args = m_logger.warning.call_args[0] ++ self.assertRegex(fmt % tuple(args), ++ "^Executing took .* seconds$") ++ + + if __name__ == '__main__': + unittest.main() +diff -r c0e311e010fc Lib/test/test_asyncio/test_events.py +--- a/Lib/test/test_asyncio/test_events.py ++++ b/Lib/test/test_asyncio/test_events.py +@@ -5,6 +5,7 @@ + import io + import os + import platform ++import re + import signal + import socket + try: +@@ -223,7 +224,7 @@ + def setUp(self): + super().setUp() + self.loop = self.create_event_loop() +- asyncio.set_event_loop(None) ++ self.set_event_loop(self.loop) + + def tearDown(self): + # just in case if we have transport close callbacks +@@ -521,6 +522,7 @@ + tr, pr = self.loop.run_until_complete(connection_fut) + self.assertIsInstance(tr, asyncio.Transport) + self.assertIsInstance(pr, asyncio.Protocol) ++ self.assertIs(pr.transport, tr) + if check_sockname: + self.assertIsNotNone(tr.get_extra_info('sockname')) + self.loop.run_until_complete(pr.done) +@@ -713,7 +715,7 @@ + with self.assertRaisesRegex(ValueError, + 'path and sock can not be specified ' + 'at the same time'): +- server = self.loop.run_until_complete(f) ++ self.loop.run_until_complete(f) + + def _create_ssl_context(self, certfile, keyfile=None): + sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) +@@ -817,9 +819,10 @@ + # no CA loaded + f_c = self.loop.create_connection(MyProto, host, port, + ssl=sslcontext_client) +- with self.assertRaisesRegex(ssl.SSLError, +- 'certificate verify failed '): +- self.loop.run_until_complete(f_c) ++ with test_utils.disable_logger(): ++ with self.assertRaisesRegex(ssl.SSLError, ++ 'certificate verify failed '): ++ self.loop.run_until_complete(f_c) + + # close connection + 
self.assertIsNone(proto.transport) +@@ -843,9 +846,10 @@ + f_c = self.loop.create_unix_connection(MyProto, path, + ssl=sslcontext_client, + server_hostname='invalid') +- with self.assertRaisesRegex(ssl.SSLError, +- 'certificate verify failed '): +- self.loop.run_until_complete(f_c) ++ with test_utils.disable_logger(): ++ with self.assertRaisesRegex(ssl.SSLError, ++ 'certificate verify failed '): ++ self.loop.run_until_complete(f_c) + + # close connection + self.assertIsNone(proto.transport) +@@ -869,10 +873,11 @@ + # incorrect server_hostname + f_c = self.loop.create_connection(MyProto, host, port, + ssl=sslcontext_client) +- with self.assertRaisesRegex( +- ssl.CertificateError, +- "hostname '127.0.0.1' doesn't match 'localhost'"): +- self.loop.run_until_complete(f_c) ++ with test_utils.disable_logger(): ++ with self.assertRaisesRegex( ++ ssl.CertificateError, ++ "hostname '127.0.0.1' doesn't match 'localhost'"): ++ self.loop.run_until_complete(f_c) + + # close connection + proto.transport.close() +@@ -1044,12 +1049,21 @@ + s_transport, server = self.loop.run_until_complete(coro) + host, port = s_transport.get_extra_info('sockname') + ++ self.assertIsInstance(s_transport, asyncio.Transport) ++ self.assertIsInstance(server, TestMyDatagramProto) ++ self.assertEqual('INITIALIZED', server.state) ++ self.assertIs(server.transport, s_transport) ++ + coro = self.loop.create_datagram_endpoint( + lambda: MyDatagramProto(loop=self.loop), + remote_addr=(host, port)) + transport, client = self.loop.run_until_complete(coro) + ++ self.assertIsInstance(transport, asyncio.Transport) ++ self.assertIsInstance(client, MyDatagramProto) + self.assertEqual('INITIALIZED', client.state) ++ self.assertIs(client.transport, transport) ++ + transport.sendto(b'xxx') + test_utils.run_until(self.loop, lambda: server.nbytes) + self.assertEqual(3, server.nbytes) +@@ -1070,6 +1084,7 @@ + def test_internal_fds(self): + loop = self.create_event_loop() + if not isinstance(loop, selector_events.BaseSelectorEventLoop): ++ loop.close() + self.skipTest('loop is not a BaseSelectorEventLoop') + + self.assertEqual(1, loop._internal_fds) +@@ -1363,6 +1378,15 @@ + with self.assertRaises(RuntimeError): + loop.add_writer(w, callback) + ++ def test_close_running_event_loop(self): ++ @asyncio.coroutine ++ def close_loop(loop): ++ self.loop.close() ++ ++ coro = close_loop(self.loop) ++ with self.assertRaises(RuntimeError): ++ self.loop.run_until_complete(coro) ++ + + class SubprocessTestsMixin: + +@@ -1627,14 +1651,14 @@ + + if sys.platform == 'win32': + +- class SelectEventLoopTests(EventLoopTestsMixin, unittest.TestCase): ++ class SelectEventLoopTests(EventLoopTestsMixin, test_utils.TestCase): + + def create_event_loop(self): + return asyncio.SelectorEventLoop() + + class ProactorEventLoopTests(EventLoopTestsMixin, + SubprocessTestsMixin, +- unittest.TestCase): ++ test_utils.TestCase): + + def create_event_loop(self): + return asyncio.ProactorEventLoop() +@@ -1689,7 +1713,7 @@ + if hasattr(selectors, 'KqueueSelector'): + class KqueueEventLoopTests(UnixEventLoopTestsMixin, + SubprocessTestsMixin, +- unittest.TestCase): ++ test_utils.TestCase): + + def create_event_loop(self): + return asyncio.SelectorEventLoop( +@@ -1714,7 +1738,7 @@ + if hasattr(selectors, 'EpollSelector'): + class EPollEventLoopTests(UnixEventLoopTestsMixin, + SubprocessTestsMixin, +- unittest.TestCase): ++ test_utils.TestCase): + + def create_event_loop(self): + return asyncio.SelectorEventLoop(selectors.EpollSelector()) +@@ -1722,7 +1746,7 @@ + if hasattr(selectors, 
'PollSelector'): + class PollEventLoopTests(UnixEventLoopTestsMixin, + SubprocessTestsMixin, +- unittest.TestCase): ++ test_utils.TestCase): + + def create_event_loop(self): + return asyncio.SelectorEventLoop(selectors.PollSelector()) +@@ -1730,71 +1754,159 @@ + # Should always exist. + class SelectEventLoopTests(UnixEventLoopTestsMixin, + SubprocessTestsMixin, +- unittest.TestCase): ++ test_utils.TestCase): + + def create_event_loop(self): + return asyncio.SelectorEventLoop(selectors.SelectSelector()) + + +-class HandleTests(unittest.TestCase): ++def noop(*args): ++ pass ++ ++ ++class HandleTests(test_utils.TestCase): ++ ++ def setUp(self): ++ self.loop = mock.Mock() ++ self.loop.get_debug.return_value = True + + def test_handle(self): + def callback(*args): + return args + + args = () +- h = asyncio.Handle(callback, args, mock.Mock()) ++ h = asyncio.Handle(callback, args, self.loop) + self.assertIs(h._callback, callback) + self.assertIs(h._args, args) + self.assertFalse(h._cancelled) + +- r = repr(h) +- self.assertTrue(r.startswith( +- 'Handle(' +- '.callback')) +- self.assertTrue(r.endswith('())')) +- + h.cancel() + self.assertTrue(h._cancelled) + +- r = repr(h) +- self.assertTrue(r.startswith( +- 'Handle(' +- '.callback')) +- self.assertTrue(r.endswith('())'), r) +- + def test_handle_from_handle(self): + def callback(*args): + return args +- m_loop = object() +- h1 = asyncio.Handle(callback, (), loop=m_loop) ++ h1 = asyncio.Handle(callback, (), loop=self.loop) + self.assertRaises( +- AssertionError, asyncio.Handle, h1, (), m_loop) ++ AssertionError, asyncio.Handle, h1, (), self.loop) + + def test_callback_with_exception(self): + def callback(): + raise ValueError() + +- m_loop = mock.Mock() +- m_loop.call_exception_handler = mock.Mock() ++ self.loop = mock.Mock() ++ self.loop.call_exception_handler = mock.Mock() + +- h = asyncio.Handle(callback, (), m_loop) ++ h = asyncio.Handle(callback, (), self.loop) + h._run() + +- m_loop.call_exception_handler.assert_called_with({ ++ self.loop.call_exception_handler.assert_called_with({ + 'message': test_utils.MockPattern('Exception in callback.*'), + 'exception': mock.ANY, +- 'handle': h ++ 'handle': h, ++ 'source_traceback': h._source_traceback, + }) + + def test_handle_weakref(self): + wd = weakref.WeakValueDictionary() +- h = asyncio.Handle(lambda: None, (), object()) ++ h = asyncio.Handle(lambda: None, (), self.loop) + wd['h'] = h # Would fail without __weakref__ slot. 
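Relatedly, the Handle objects covered by these tests are what call_soon() and friends return; cancelling one simply skips the callback. Minimal sketch:

    import asyncio

    loop = asyncio.new_event_loop()
    handle = loop.call_soon(print, 'never runs')
    handle.cancel()                   # cancelled handles are skipped silently
    loop.call_soon(loop.stop)
    loop.run_forever()
    loop.close()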
+ ++ def test_handle_repr(self): ++ self.loop.get_debug.return_value = False ++ ++ # simple function ++ h = asyncio.Handle(noop, (1, 2), self.loop) ++ filename, lineno = test_utils.get_function_source(noop) ++ self.assertEqual(repr(h), ++ '' ++ % (filename, lineno)) ++ ++ # cancelled handle ++ h.cancel() ++ self.assertEqual(repr(h), ++ '') ++ ++ # decorated function ++ cb = asyncio.coroutine(noop) ++ h = asyncio.Handle(cb, (), self.loop) ++ self.assertEqual(repr(h), ++ '' ++ % (filename, lineno)) ++ ++ # partial function ++ cb = functools.partial(noop, 1, 2) ++ h = asyncio.Handle(cb, (3,), self.loop) ++ regex = (r'^$' ++ % (re.escape(filename), lineno)) ++ self.assertRegex(repr(h), regex) ++ ++ # partial method ++ if sys.version_info >= (3, 4): ++ method = HandleTests.test_handle_repr ++ cb = functools.partialmethod(method) ++ filename, lineno = test_utils.get_function_source(method) ++ h = asyncio.Handle(cb, (), self.loop) ++ ++ cb_regex = r'' ++ cb_regex = (r'functools.partialmethod\(%s, , \)\(\)' % cb_regex) ++ regex = (r'^$' ++ % (cb_regex, re.escape(filename), lineno)) ++ self.assertRegex(repr(h), regex) ++ ++ def test_handle_repr_debug(self): ++ self.loop.get_debug.return_value = True ++ ++ # simple function ++ create_filename = __file__ ++ create_lineno = sys._getframe().f_lineno + 1 ++ h = asyncio.Handle(noop, (1, 2), self.loop) ++ filename, lineno = test_utils.get_function_source(noop) ++ self.assertEqual(repr(h), ++ '' ++ % (filename, lineno, create_filename, create_lineno)) ++ ++ # cancelled handle ++ h.cancel() ++ self.assertEqual(repr(h), ++ '' ++ % (create_filename, create_lineno)) ++ ++ def test_handle_source_traceback(self): ++ loop = asyncio.get_event_loop_policy().new_event_loop() ++ loop.set_debug(True) ++ self.set_event_loop(loop) ++ ++ def check_source_traceback(h): ++ lineno = sys._getframe(1).f_lineno - 1 ++ self.assertIsInstance(h._source_traceback, list) ++ self.assertEqual(h._source_traceback[-1][:3], ++ (__file__, ++ lineno, ++ 'test_handle_source_traceback')) ++ ++ # call_soon ++ h = loop.call_soon(noop) ++ check_source_traceback(h) ++ ++ # call_soon_threadsafe ++ h = loop.call_soon_threadsafe(noop) ++ check_source_traceback(h) ++ ++ # call_later ++ h = loop.call_later(0, noop) ++ check_source_traceback(h) ++ ++ # call_at ++ h = loop.call_later(0, noop) ++ check_source_traceback(h) ++ + + class TimerTests(unittest.TestCase): + ++ def setUp(self): ++ self.loop = mock.Mock() ++ + def test_hash(self): + when = time.monotonic() + h = asyncio.TimerHandle(when, lambda: False, (), +@@ -1805,36 +1917,66 @@ + def callback(*args): + return args + +- args = () ++ args = (1, 2, 3) + when = time.monotonic() + h = asyncio.TimerHandle(when, callback, args, mock.Mock()) + self.assertIs(h._callback, callback) + self.assertIs(h._args, args) + self.assertFalse(h._cancelled) + +- r = repr(h) +- self.assertTrue(r.endswith('())')) +- ++ # cancel + h.cancel() + self.assertTrue(h._cancelled) ++ self.assertIsNone(h._callback) ++ self.assertIsNone(h._args) + +- r = repr(h) +- self.assertTrue(r.endswith('())'), r) +- ++ # when cannot be None + self.assertRaises(AssertionError, + asyncio.TimerHandle, None, callback, args, +- mock.Mock()) ++ self.loop) ++ ++ def test_timer_repr(self): ++ self.loop.get_debug.return_value = False ++ ++ # simple function ++ h = asyncio.TimerHandle(123, noop, (), self.loop) ++ src = test_utils.get_function_source(noop) ++ self.assertEqual(repr(h), ++ '' % src) ++ ++ # cancelled handle ++ h.cancel() ++ self.assertEqual(repr(h), ++ '') ++ ++ def 
test_timer_repr_debug(self): ++ self.loop.get_debug.return_value = True ++ ++ # simple function ++ create_filename = __file__ ++ create_lineno = sys._getframe().f_lineno + 1 ++ h = asyncio.TimerHandle(123, noop, (), self.loop) ++ filename, lineno = test_utils.get_function_source(noop) ++ self.assertEqual(repr(h), ++ '' ++ % (filename, lineno, create_filename, create_lineno)) ++ ++ # cancelled handle ++ h.cancel() ++ self.assertEqual(repr(h), ++ '' ++ % (create_filename, create_lineno)) ++ + + def test_timer_comparison(self): +- loop = mock.Mock() +- + def callback(*args): + return args + + when = time.monotonic() + +- h1 = asyncio.TimerHandle(when, callback, (), loop) +- h2 = asyncio.TimerHandle(when, callback, (), loop) ++ h1 = asyncio.TimerHandle(when, callback, (), self.loop) ++ h2 = asyncio.TimerHandle(when, callback, (), self.loop) + # TODO: Use assertLess etc. + self.assertFalse(h1 < h2) + self.assertFalse(h2 < h1) +@@ -1850,8 +1992,8 @@ + h2.cancel() + self.assertFalse(h1 == h2) + +- h1 = asyncio.TimerHandle(when, callback, (), loop) +- h2 = asyncio.TimerHandle(when + 10.0, callback, (), loop) ++ h1 = asyncio.TimerHandle(when, callback, (), self.loop) ++ h2 = asyncio.TimerHandle(when + 10.0, callback, (), self.loop) + self.assertTrue(h1 < h2) + self.assertFalse(h2 < h1) + self.assertTrue(h1 <= h2) +@@ -1863,7 +2005,7 @@ + self.assertFalse(h1 == h2) + self.assertTrue(h1 != h2) + +- h3 = asyncio.Handle(callback, (), loop) ++ h3 = asyncio.Handle(callback, (), self.loop) + self.assertIs(NotImplemented, h1.__eq__(h3)) + self.assertIs(NotImplemented, h1.__ne__(h3)) + +@@ -1882,8 +2024,12 @@ + self.assertRaises( + NotImplementedError, loop.is_running) + self.assertRaises( ++ NotImplementedError, loop.is_closed) ++ self.assertRaises( + NotImplementedError, loop.close) + self.assertRaises( ++ NotImplementedError, loop.create_task, None) ++ self.assertRaises( + NotImplementedError, loop.call_later, None, None) + self.assertRaises( + NotImplementedError, loop.call_at, f, f) +@@ -1940,6 +2086,16 @@ + mock.sentinel) + self.assertRaises( + NotImplementedError, loop.subprocess_exec, f) ++ self.assertRaises( ++ NotImplementedError, loop.set_exception_handler, f) ++ self.assertRaises( ++ NotImplementedError, loop.default_exception_handler, f) ++ self.assertRaises( ++ NotImplementedError, loop.call_exception_handler, f) ++ self.assertRaises( ++ NotImplementedError, loop.get_debug) ++ self.assertRaises( ++ NotImplementedError, loop.set_debug, f) + + + class ProtocolsAbsTests(unittest.TestCase): +diff -r c0e311e010fc Lib/test/test_asyncio/test_futures.py +--- a/Lib/test/test_asyncio/test_futures.py ++++ b/Lib/test/test_asyncio/test_futures.py +@@ -1,8 +1,11 @@ + """Tests for futures.py.""" + + import concurrent.futures ++import re ++import sys + import threading + import unittest ++from test import support + from unittest import mock + + import asyncio +@@ -12,15 +15,17 @@ + def _fakefunc(f): + return f + ++def first_cb(): ++ pass + +-class FutureTests(unittest.TestCase): ++def last_cb(): ++ pass ++ ++ ++class FutureTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.loop = self.new_test_loop() + + def test_initial_state(self): + f = asyncio.Future(loop=self.loop) +@@ -30,12 +35,9 @@ + self.assertTrue(f.cancelled()) + + def test_init_constructor_default_loop(self): +- try: +- asyncio.set_event_loop(self.loop) +- f = asyncio.Future() +- self.assertIs(f._loop, self.loop) +- finally: +- 
asyncio.set_event_loop(None) ++ asyncio.set_event_loop(self.loop) ++ f = asyncio.Future() ++ self.assertIs(f._loop, self.loop) + + def test_constructor_positional(self): + # Make sure Future doesn't accept a positional argument +@@ -102,39 +104,60 @@ + # The second "yield from f" does not yield f. + self.assertEqual(next(g), ('C', 42)) # yield 'C', y. + +- def test_repr(self): ++ def test_future_repr(self): + f_pending = asyncio.Future(loop=self.loop) +- self.assertEqual(repr(f_pending), 'Future') ++ self.assertEqual(repr(f_pending), '') + f_pending.cancel() + + f_cancelled = asyncio.Future(loop=self.loop) + f_cancelled.cancel() +- self.assertEqual(repr(f_cancelled), 'Future') ++ self.assertEqual(repr(f_cancelled), '') + + f_result = asyncio.Future(loop=self.loop) + f_result.set_result(4) +- self.assertEqual(repr(f_result), 'Future') ++ self.assertEqual(repr(f_result), '') + self.assertEqual(f_result.result(), 4) + + exc = RuntimeError() + f_exception = asyncio.Future(loop=self.loop) + f_exception.set_exception(exc) +- self.assertEqual(repr(f_exception), 'Future') ++ self.assertEqual(repr(f_exception), '') + self.assertIs(f_exception.exception(), exc) + +- f_few_callbacks = asyncio.Future(loop=self.loop) +- f_few_callbacks.add_done_callback(_fakefunc) +- self.assertIn('Future' % fake_repr) ++ f_one_callbacks.cancel() ++ self.assertEqual(repr(f_one_callbacks), ++ '') ++ ++ f_two_callbacks = asyncio.Future(loop=self.loop) ++ f_two_callbacks.add_done_callback(first_cb) ++ f_two_callbacks.add_done_callback(last_cb) ++ first_repr = func_repr(first_cb) ++ last_repr = func_repr(last_cb) ++ self.assertRegex(repr(f_two_callbacks), ++ r'' ++ % (first_repr, last_repr)) + + f_many_callbacks = asyncio.Future(loop=self.loop) +- for i in range(20): ++ f_many_callbacks.add_done_callback(first_cb) ++ for i in range(8): + f_many_callbacks.add_done_callback(_fakefunc) +- r = repr(f_many_callbacks) +- self.assertIn('Future', r) ++ f_many_callbacks.add_done_callback(last_cb) ++ cb_regex = r'%s, <8 more>, %s' % (first_repr, last_repr) ++ self.assertRegex(repr(f_many_callbacks), ++ r'' % cb_regex) + f_many_callbacks.cancel() ++ self.assertEqual(repr(f_many_callbacks), ++ '') + + def test_copy_state(self): + # Test the internal _copy_state method since it's being directly +@@ -263,15 +286,80 @@ + self.assertEqual(f1.result(), 42) + self.assertTrue(f2.cancelled()) + ++ def test_future_source_traceback(self): ++ self.loop.set_debug(True) + +-class FutureDoneCallbackTests(unittest.TestCase): ++ future = asyncio.Future(loop=self.loop) ++ lineno = sys._getframe().f_lineno - 1 ++ self.assertIsInstance(future._source_traceback, list) ++ self.assertEqual(future._source_traceback[-1][:3], ++ (__file__, ++ lineno, ++ 'test_future_source_traceback')) ++ ++ @mock.patch('asyncio.base_events.logger') ++ def test_future_exception_never_retrieved(self, m_log): ++ # FIXME: Python issue #21163, other tests may "leak" pending task which ++ # emit a warning when they are destroyed by the GC ++ support.gc_collect() ++ m_log.error.reset_mock() ++ # --- ++ ++ self.loop.set_debug(True) ++ ++ def memory_error(): ++ try: ++ raise MemoryError() ++ except BaseException as exc: ++ return exc ++ exc = memory_error() ++ ++ future = asyncio.Future(loop=self.loop) ++ source_traceback = future._source_traceback ++ future.set_exception(exc) ++ future = None ++ test_utils.run_briefly(self.loop) ++ support.gc_collect() ++ ++ if sys.version_info >= (3, 4): ++ frame = source_traceback[-1] ++ regex = (r'^Future exception was never retrieved\n' ++ 
r'future: \n' ++ r'source_traceback: Object created at \(most recent call last\):\n' ++ r' File' ++ r'.*\n' ++ r' File "{filename}", line {lineno}, in test_future_exception_never_retrieved\n' ++ r' future = asyncio\.Future\(loop=self\.loop\)$' ++ ).format(filename=re.escape(frame[0]), lineno=frame[1]) ++ exc_info = (type(exc), exc, exc.__traceback__) ++ m_log.error.assert_called_once_with(mock.ANY, exc_info=exc_info) ++ else: ++ frame = source_traceback[-1] ++ regex = (r'^Future/Task exception was never retrieved\n' ++ r'Future/Task created at \(most recent call last\):\n' ++ r' File' ++ r'.*\n' ++ r' File "{filename}", line {lineno}, in test_future_exception_never_retrieved\n' ++ r' future = asyncio\.Future\(loop=self\.loop\)\n' ++ r'Traceback \(most recent call last\):\n' ++ r'.*\n' ++ r'MemoryError$' ++ ).format(filename=re.escape(frame[0]), lineno=frame[1]) ++ m_log.error.assert_called_once_with(mock.ANY, exc_info=False) ++ message = m_log.error.call_args[0][0] ++ self.assertRegex(message, re.compile(regex, re.DOTALL)) ++ ++ def test_set_result_unless_cancelled(self): ++ fut = asyncio.Future(loop=self.loop) ++ fut.cancel() ++ fut._set_result_unless_cancelled(2) ++ self.assertTrue(fut.cancelled()) ++ ++ ++class FutureDoneCallbackTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.loop = self.new_test_loop() + + def run_briefly(self): + test_utils.run_briefly(self.loop) +diff -r c0e311e010fc Lib/test/test_asyncio/test_locks.py +--- a/Lib/test/test_asyncio/test_locks.py ++++ b/Lib/test/test_asyncio/test_locks.py +@@ -17,14 +17,10 @@ + RGX_REPR = re.compile(STR_RGX_REPR) + + +-class LockTests(unittest.TestCase): ++class LockTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.loop = self.new_test_loop() + + def test_ctor_loop(self): + loop = mock.Mock() +@@ -35,12 +31,9 @@ + self.assertIs(lock._loop, self.loop) + + def test_ctor_noloop(self): +- try: +- asyncio.set_event_loop(self.loop) +- lock = asyncio.Lock() +- self.assertIs(lock._loop, self.loop) +- finally: +- asyncio.set_event_loop(None) ++ asyncio.set_event_loop(self.loop) ++ lock = asyncio.Lock() ++ self.assertIs(lock._loop, self.loop) + + def test_repr(self): + lock = asyncio.Lock(loop=self.loop) +@@ -240,14 +233,10 @@ + self.assertFalse(lock.locked()) + + +-class EventTests(unittest.TestCase): ++class EventTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.loop = self.new_test_loop() + + def test_ctor_loop(self): + loop = mock.Mock() +@@ -258,12 +247,9 @@ + self.assertIs(ev._loop, self.loop) + + def test_ctor_noloop(self): +- try: +- asyncio.set_event_loop(self.loop) +- ev = asyncio.Event() +- self.assertIs(ev._loop, self.loop) +- finally: +- asyncio.set_event_loop(None) ++ asyncio.set_event_loop(self.loop) ++ ev = asyncio.Event() ++ self.assertIs(ev._loop, self.loop) + + def test_repr(self): + ev = asyncio.Event(loop=self.loop) +@@ -376,14 +362,10 @@ + self.assertTrue(t.result()) + + +-class ConditionTests(unittest.TestCase): ++class ConditionTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.loop = self.new_test_loop() + + def test_ctor_loop(self): + 
loop = mock.Mock() +@@ -394,12 +376,9 @@ + self.assertIs(cond._loop, self.loop) + + def test_ctor_noloop(self): +- try: +- asyncio.set_event_loop(self.loop) +- cond = asyncio.Condition() +- self.assertIs(cond._loop, self.loop) +- finally: +- asyncio.set_event_loop(None) ++ asyncio.set_event_loop(self.loop) ++ cond = asyncio.Condition() ++ self.assertIs(cond._loop, self.loop) + + def test_wait(self): + cond = asyncio.Condition(loop=self.loop) +@@ -678,14 +657,10 @@ + self.assertFalse(cond.locked()) + + +-class SemaphoreTests(unittest.TestCase): ++class SemaphoreTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.loop = self.new_test_loop() + + def test_ctor_loop(self): + loop = mock.Mock() +@@ -696,12 +671,9 @@ + self.assertIs(sem._loop, self.loop) + + def test_ctor_noloop(self): +- try: +- asyncio.set_event_loop(self.loop) +- sem = asyncio.Semaphore() +- self.assertIs(sem._loop, self.loop) +- finally: +- asyncio.set_event_loop(None) ++ asyncio.set_event_loop(self.loop) ++ sem = asyncio.Semaphore() ++ self.assertIs(sem._loop, self.loop) + + def test_initial_value_zero(self): + sem = asyncio.Semaphore(0, loop=self.loop) +@@ -811,6 +783,7 @@ + + # cleanup locked semaphore + sem.release() ++ self.loop.run_until_complete(t4) + + def test_acquire_cancel(self): + sem = asyncio.Semaphore(loop=self.loop) +diff -r c0e311e010fc Lib/test/test_asyncio/test_proactor_events.py +--- a/Lib/test/test_asyncio/test_proactor_events.py ++++ b/Lib/test/test_asyncio/test_proactor_events.py +@@ -12,10 +12,10 @@ + from asyncio import test_utils + + +-class ProactorSocketTransportTests(unittest.TestCase): ++class ProactorSocketTransportTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() ++ self.loop = self.new_test_loop() + self.proactor = mock.Mock() + self.loop._proactor = self.proactor + self.protocol = test_utils.make_test_protocol(asyncio.Protocol) +@@ -343,7 +343,7 @@ + tr.close() + + +-class BaseProactorEventLoopTests(unittest.TestCase): ++class BaseProactorEventLoopTests(test_utils.TestCase): + + def setUp(self): + self.sock = mock.Mock(socket.socket) +@@ -356,17 +356,19 @@ + return (self.ssock, self.csock) + + self.loop = EventLoop(self.proactor) ++ self.set_event_loop(self.loop, cleanup=False) + +- @mock.patch.object(BaseProactorEventLoop, 'call_soon') ++ @mock.patch.object(BaseProactorEventLoop, '_call_soon') + @mock.patch.object(BaseProactorEventLoop, '_socketpair') +- def test_ctor(self, socketpair, call_soon): ++ def test_ctor(self, socketpair, _call_soon): + ssock, csock = socketpair.return_value = ( + mock.Mock(), mock.Mock()) + loop = BaseProactorEventLoop(self.proactor) + self.assertIs(loop._ssock, ssock) + self.assertIs(loop._csock, csock) + self.assertEqual(loop._internal_fds, 1) +- call_soon.assert_called_with(loop._loop_self_reading) ++ _call_soon.assert_called_with(loop._loop_self_reading, (), ++ check_loop=False) + + def test_close_self_pipe(self): + self.loop._close_self_pipe() +@@ -433,7 +435,7 @@ + + def test_write_to_self(self): + self.loop._write_to_self() +- self.csock.send.assert_called_with(b'x') ++ self.csock.send.assert_called_with(b'\0') + + def test_process_events(self): + self.loop._process_events([]) +diff -r c0e311e010fc Lib/test/test_asyncio/test_queues.py +--- a/Lib/test/test_asyncio/test_queues.py ++++ b/Lib/test/test_asyncio/test_queues.py +@@ -7,14 +7,10 @@ + from asyncio import test_utils + + +-class 
_QueueTestBase(unittest.TestCase): ++class _QueueTestBase(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.loop = self.new_test_loop() + + + class QueueBasicTests(_QueueTestBase): +@@ -32,8 +28,7 @@ + self.assertAlmostEqual(0.2, when) + yield 0.1 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + q = asyncio.Queue(loop=loop) + self.assertTrue(fn(q).startswith('= (3, 4)) ++PY35 = (sys.version_info >= (3, 5)) ++ ++ + @asyncio.coroutine + def coroutine_function(): + pass + + ++def format_coroutine(qualname, state, src, source_traceback, generator=False): ++ if generator: ++ state = '%s' % state ++ else: ++ state = '%s, defined' % state ++ if source_traceback is not None: ++ frame = source_traceback[-1] ++ return ('coro=<%s() %s at %s> created at %s:%s' ++ % (qualname, state, src, frame[0], frame[1])) ++ else: ++ return 'coro=<%s() %s at %s>' % (qualname, state, src) ++ ++ + class Dummy: + + def __repr__(self): +- return 'Dummy()' ++ return '' + + def __call__(self, *args): + pass + + +-class TaskTests(unittest.TestCase): ++class TaskTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() +- gc.collect() ++ self.loop = self.new_test_loop() + + def test_task_class(self): + @asyncio.coroutine +@@ -46,8 +61,10 @@ + self.assertIs(t._loop, self.loop) + + loop = asyncio.new_event_loop() ++ self.set_event_loop(loop) + t = asyncio.Task(notmuch(), loop=loop) + self.assertIs(t._loop, loop) ++ loop.run_until_complete(t) + loop.close() + + def test_async_coroutine(self): +@@ -61,8 +78,10 @@ + self.assertIs(t._loop, self.loop) + + loop = asyncio.new_event_loop() ++ self.set_event_loop(loop) + t = asyncio.async(notmuch(), loop=loop) + self.assertIs(t._loop, loop) ++ loop.run_until_complete(t) + loop.close() + + def test_async_future(self): +@@ -76,6 +95,7 @@ + self.assertIs(f, f_orig) + + loop = asyncio.new_event_loop() ++ self.set_event_loop(loop) + + with self.assertRaises(ValueError): + f = asyncio.async(f_orig, loop=loop) +@@ -97,6 +117,7 @@ + self.assertIs(t, t_orig) + + loop = asyncio.new_event_loop() ++ self.set_event_loop(loop) + + with self.assertRaises(ValueError): + t = asyncio.async(t_orig, loop=loop) +@@ -116,35 +137,133 @@ + yield from [] + return 'abc' + +- t = asyncio.Task(notmuch(), loop=self.loop) ++ # test coroutine function ++ self.assertEqual(notmuch.__name__, 'notmuch') ++ if PY35: ++ self.assertEqual(notmuch.__qualname__, ++ 'TaskTests.test_task_repr..notmuch') ++ self.assertEqual(notmuch.__module__, __name__) ++ ++ filename, lineno = test_utils.get_function_source(notmuch) ++ src = "%s:%s" % (filename, lineno) ++ ++ # test coroutine object ++ gen = notmuch() ++ if coroutines._DEBUG or PY35: ++ coro_qualname = 'TaskTests.test_task_repr..notmuch' ++ else: ++ coro_qualname = 'notmuch' ++ self.assertEqual(gen.__name__, 'notmuch') ++ if PY35: ++ self.assertEqual(gen.__qualname__, ++ coro_qualname) ++ ++ # test pending Task ++ t = asyncio.Task(gen, loop=self.loop) + t.add_done_callback(Dummy()) +- self.assertEqual(repr(t), 'Task()') ++ ++ coro = format_coroutine(coro_qualname, 'running', src, ++ t._source_traceback, generator=True) ++ self.assertEqual(repr(t), ++ '()]>' % coro) ++ ++ # test cancelling Task + t.cancel() # Does not take immediate effect! 
+- self.assertEqual(repr(t), 'Task()') ++ self.assertEqual(repr(t), ++ '()]>' % coro) ++ ++ # test cancelled Task + self.assertRaises(asyncio.CancelledError, + self.loop.run_until_complete, t) +- self.assertEqual(repr(t), 'Task()') ++ coro = format_coroutine(coro_qualname, 'done', src, ++ t._source_traceback) ++ self.assertEqual(repr(t), ++ '' % coro) ++ ++ # test finished Task + t = asyncio.Task(notmuch(), loop=self.loop) + self.loop.run_until_complete(t) +- self.assertEqual(repr(t), "Task()") ++ coro = format_coroutine(coro_qualname, 'done', src, ++ t._source_traceback) ++ self.assertEqual(repr(t), ++ "" % coro) + +- def test_task_repr_custom(self): ++ def test_task_repr_coro_decorator(self): + @asyncio.coroutine +- def coro(): +- pass ++ def notmuch(): ++ # notmuch() function doesn't use yield from: it will be wrapped by ++ # @coroutine decorator ++ return 123 + +- class T(asyncio.Future): +- def __repr__(self): +- return 'T[]' ++ # test coroutine function ++ self.assertEqual(notmuch.__name__, 'notmuch') ++ if PY35: ++ self.assertEqual(notmuch.__qualname__, ++ 'TaskTests.test_task_repr_coro_decorator..notmuch') ++ self.assertEqual(notmuch.__module__, __name__) + +- class MyTask(asyncio.Task, T): +- def __repr__(self): +- return super().__repr__() ++ # test coroutine object ++ gen = notmuch() ++ if coroutines._DEBUG or PY35: ++ # On Python >= 3.5, generators now inherit the name of the ++ # function, as expected, and have a qualified name (__qualname__ ++ # attribute). ++ coro_name = 'notmuch' ++ coro_qualname = 'TaskTests.test_task_repr_coro_decorator..notmuch' ++ else: ++ # On Python < 3.5, generators inherit the name of the code, not of ++ # the function. See: http://bugs.python.org/issue21205 ++ coro_name = coro_qualname = 'coro' ++ self.assertEqual(gen.__name__, coro_name) ++ if PY35: ++ self.assertEqual(gen.__qualname__, coro_qualname) + +- gen = coro() +- t = MyTask(gen, loop=self.loop) +- self.assertEqual(repr(t), 'T[]()') +- gen.close() ++ # test repr(CoroWrapper) ++ if coroutines._DEBUG: ++ # format the coroutine object ++ if coroutines._DEBUG: ++ filename, lineno = test_utils.get_function_source(notmuch) ++ frame = gen._source_traceback[-1] ++ coro = ('%s() running, defined at %s:%s, created at %s:%s' ++ % (coro_qualname, filename, lineno, ++ frame[0], frame[1])) ++ else: ++ code = gen.gi_code ++ coro = ('%s() running at %s:%s' ++ % (coro_qualname, code.co_filename, code.co_firstlineno)) ++ ++ self.assertEqual(repr(gen), '' % coro) ++ ++ # test pending Task ++ t = asyncio.Task(gen, loop=self.loop) ++ t.add_done_callback(Dummy()) ++ ++ # format the coroutine object ++ if coroutines._DEBUG: ++ src = '%s:%s' % test_utils.get_function_source(notmuch) ++ else: ++ code = gen.gi_code ++ src = '%s:%s' % (code.co_filename, code.co_firstlineno) ++ coro = format_coroutine(coro_qualname, 'running', src, ++ t._source_traceback, ++ generator=not coroutines._DEBUG) ++ self.assertEqual(repr(t), ++ '()]>' % coro) ++ self.loop.run_until_complete(t) ++ ++ def test_task_repr_wait_for(self): ++ @asyncio.coroutine ++ def wait_for(fut): ++ return (yield from fut) ++ ++ fut = asyncio.Future(loop=self.loop) ++ task = asyncio.Task(wait_for(fut), loop=self.loop) ++ test_utils.run_briefly(self.loop) ++ self.assertRegex(repr(task), ++ '' % re.escape(repr(fut))) ++ ++ fut.set_result(None) ++ self.loop.run_until_complete(task) + + def test_task_basics(self): + @asyncio.coroutine +@@ -171,8 +290,7 @@ + self.assertAlmostEqual(10.0, when) + yield 0 + +- loop = test_utils.TestLoop(gen) +- 
self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + @asyncio.coroutine + def task(): +@@ -297,7 +415,7 @@ + + def test_cancel_current_task(self): + loop = asyncio.new_event_loop() +- self.addCleanup(loop.close) ++ self.set_event_loop(loop) + + @asyncio.coroutine + def task(): +@@ -325,8 +443,7 @@ + self.assertAlmostEqual(0.3, when) + yield 0.1 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + x = 0 + waiters = [] +@@ -342,8 +459,10 @@ + loop.stop() + + t = asyncio.Task(task(), loop=loop) +- self.assertRaises( +- RuntimeError, loop.run_until_complete, t) ++ with self.assertRaises(RuntimeError) as cm: ++ loop.run_until_complete(t) ++ self.assertEqual(str(cm.exception), ++ 'Event loop stopped before Future completed.') + self.assertFalse(t.done()) + self.assertEqual(x, 2) + self.assertAlmostEqual(0.3, loop.time()) +@@ -351,6 +470,8 @@ + # close generators + for w in waiters: + w.close() ++ t.cancel() ++ self.assertRaises(asyncio.CancelledError, loop.run_until_complete, t) + + def test_wait_for(self): + +@@ -361,8 +482,7 @@ + self.assertAlmostEqual(0.1, when) + when = yield 0.1 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + foo_running = None + +@@ -387,8 +507,7 @@ + self.assertEqual(foo_running, False) + + def test_wait_for_blocking(self): +- loop = test_utils.TestLoop() +- self.addCleanup(loop.close) ++ loop = self.new_test_loop() + + @asyncio.coroutine + def coro(): +@@ -408,8 +527,7 @@ + self.assertAlmostEqual(0.01, when) + yield 0.01 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + @asyncio.coroutine + def foo(): +@@ -437,8 +555,7 @@ + self.assertAlmostEqual(0.15, when) + yield 0.15 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.Task(asyncio.sleep(0.1, loop=loop), loop=loop) + b = asyncio.Task(asyncio.sleep(0.15, loop=loop), loop=loop) +@@ -468,8 +585,7 @@ + self.assertAlmostEqual(0.015, when) + yield 0.015 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.Task(asyncio.sleep(0.01, loop=loop), loop=loop) + b = asyncio.Task(asyncio.sleep(0.015, loop=loop), loop=loop) +@@ -482,11 +598,8 @@ + return 42 + + asyncio.set_event_loop(loop) +- try: +- res = loop.run_until_complete( +- asyncio.Task(foo(), loop=loop)) +- finally: +- asyncio.set_event_loop(None) ++ res = loop.run_until_complete( ++ asyncio.Task(foo(), loop=loop)) + + self.assertEqual(res, 42) + +@@ -510,10 +623,13 @@ + ValueError, self.loop.run_until_complete, + asyncio.wait(set(), loop=self.loop)) + +- self.assertRaises( +- ValueError, self.loop.run_until_complete, +- asyncio.wait([asyncio.sleep(10.0, loop=self.loop)], +- return_when=-1, loop=self.loop)) ++ # -1 is an invalid return_when value ++ sleep_coro = asyncio.sleep(10.0, loop=self.loop) ++ wait_coro = asyncio.wait([sleep_coro], return_when=-1, loop=self.loop) ++ self.assertRaises(ValueError, ++ self.loop.run_until_complete, wait_coro) ++ ++ sleep_coro.close() + + def test_wait_first_completed(self): + +@@ -524,8 +640,7 @@ + self.assertAlmostEqual(0.1, when) + yield 0.1 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.Task(asyncio.sleep(10.0, loop=loop), loop=loop) + b = asyncio.Task(asyncio.sleep(0.1, loop=loop), loop=loop) +@@ -580,8 +695,7 @@ + self.assertAlmostEqual(10.0, when) + yield 
0 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + # first_exception, task already has exception + a = asyncio.Task(asyncio.sleep(10.0, loop=loop), loop=loop) +@@ -614,8 +728,7 @@ + self.assertAlmostEqual(0.01, when) + yield 0.01 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + # first_exception, exception during waiting + a = asyncio.Task(asyncio.sleep(10.0, loop=loop), loop=loop) +@@ -647,8 +760,7 @@ + self.assertAlmostEqual(0.15, when) + yield 0.15 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.Task(asyncio.sleep(0.1, loop=loop), loop=loop) + +@@ -684,8 +796,7 @@ + self.assertAlmostEqual(0.11, when) + yield 0.11 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.Task(asyncio.sleep(0.1, loop=loop), loop=loop) + b = asyncio.Task(asyncio.sleep(0.15, loop=loop), loop=loop) +@@ -715,8 +826,7 @@ + self.assertAlmostEqual(0.1, when) + yield 0.1 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.Task(asyncio.sleep(0.1, loop=loop), loop=loop) + b = asyncio.Task(asyncio.sleep(0.15, loop=loop), loop=loop) +@@ -740,8 +850,9 @@ + yield 0.01 + yield 0 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) ++ # disable "slow callback" warning ++ loop.slow_callback_duration = 1.0 + completed = set() + time_shifted = False + +@@ -784,8 +895,7 @@ + yield 0 + yield 0.1 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.sleep(0.1, 'a', loop=loop) + b = asyncio.sleep(0.15, 'b', loop=loop) +@@ -821,8 +931,7 @@ + yield 0 + yield 0.01 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.sleep(0.01, 'a', loop=loop) + +@@ -841,8 +950,7 @@ + yield 0.05 + yield 0 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.sleep(0.05, 'a', loop=loop) + b = asyncio.sleep(0.10, 'b', loop=loop) +@@ -867,8 +975,7 @@ + self.assertAlmostEqual(0.05, when) + yield 0.05 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + a = asyncio.sleep(0.05, 'a', loop=loop) + b = asyncio.sleep(0.05, 'b', loop=loop) +@@ -909,8 +1016,7 @@ + self.assertAlmostEqual(0.1, when) + yield 0.05 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + @asyncio.coroutine + def sleeper(dt, arg): +@@ -931,8 +1037,7 @@ + self.assertAlmostEqual(10.0, when) + yield 0 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + t = asyncio.Task(asyncio.sleep(10.0, 'yeah', loop=loop), + loop=loop) +@@ -940,9 +1045,9 @@ + handle = None + orig_call_later = loop.call_later + +- def call_later(self, delay, callback, *args): ++ def call_later(delay, callback, *args): + nonlocal handle +- handle = orig_call_later(self, delay, callback, *args) ++ handle = orig_call_later(delay, callback, *args) + return handle + + loop.call_later = call_later +@@ -963,8 +1068,7 @@ + self.assertAlmostEqual(5000, when) + yield 0.1 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + @asyncio.coroutine + def sleep(dt): +@@ -1074,8 +1178,7 @@ 
+ self.assertAlmostEqual(10.0, when) + yield 0 + +- loop = test_utils.TestLoop(gen) +- self.addCleanup(loop.close) ++ loop = self.new_test_loop(gen) + + @asyncio.coroutine + def sleeper(): +@@ -1372,8 +1475,10 @@ + # as_completed() expects a list of futures, not a future instance + self.assertRaises(TypeError, self.loop.run_until_complete, + asyncio.as_completed(fut, loop=self.loop)) ++ coro = coroutine_function() + self.assertRaises(TypeError, self.loop.run_until_complete, +- asyncio.as_completed(coroutine_function(), loop=self.loop)) ++ asyncio.as_completed(coro, loop=self.loop)) ++ coro.close() + + def test_wait_invalid_args(self): + fut = asyncio.Future(loop=self.loop) +@@ -1381,8 +1486,10 @@ + # wait() expects a list of futures, not a future instance + self.assertRaises(TypeError, self.loop.run_until_complete, + asyncio.wait(fut, loop=self.loop)) ++ coro = coroutine_function() + self.assertRaises(TypeError, self.loop.run_until_complete, +- asyncio.wait(coroutine_function(), loop=self.loop)) ++ asyncio.wait(coro, loop=self.loop)) ++ coro.close() + + # wait() expects at least a future + self.assertRaises(ValueError, self.loop.run_until_complete, +@@ -1420,23 +1527,23 @@ + self.assertIsNone(gen.gi_frame) + + # Save debug flag. +- old_debug = asyncio.tasks._DEBUG ++ old_debug = asyncio.coroutines._DEBUG + try: + # Test with debug flag cleared. +- asyncio.tasks._DEBUG = False ++ asyncio.coroutines._DEBUG = False + check() + + # Test with debug flag set. +- asyncio.tasks._DEBUG = True ++ asyncio.coroutines._DEBUG = True + check() + + finally: + # Restore original debug flag. +- asyncio.tasks._DEBUG = old_debug ++ asyncio.coroutines._DEBUG = old_debug + + def test_yield_from_corowrapper(self): +- old_debug = asyncio.tasks._DEBUG +- asyncio.tasks._DEBUG = True ++ old_debug = asyncio.coroutines._DEBUG ++ asyncio.coroutines._DEBUG = True + try: + @asyncio.coroutine + def t1(): +@@ -1456,7 +1563,7 @@ + val = self.loop.run_until_complete(task) + self.assertEqual(val, (1, 2, 3)) + finally: +- asyncio.tasks._DEBUG = old_debug ++ asyncio.coroutines._DEBUG = old_debug + + def test_yield_from_corowrapper_send(self): + def foo(): +@@ -1464,7 +1571,7 @@ + return a + + def call(arg): +- cw = asyncio.tasks.CoroWrapper(foo(), foo) ++ cw = asyncio.coroutines.CoroWrapper(foo(), foo) + cw.send(None) + try: + cw.send(arg) +@@ -1479,20 +1586,102 @@ + def test_corowrapper_weakref(self): + wd = weakref.WeakValueDictionary() + def foo(): yield from [] +- cw = asyncio.tasks.CoroWrapper(foo(), foo) ++ cw = asyncio.coroutines.CoroWrapper(foo(), foo) + wd['cw'] = cw # Would fail without __weakref__ slot. + cw.gen = None # Suppress warning from __del__. 
+ ++ @unittest.skipUnless(PY34, ++ 'need python 3.4 or later') ++ def test_log_destroyed_pending_task(self): ++ @asyncio.coroutine ++ def kill_me(loop): ++ future = asyncio.Future(loop=loop) ++ yield from future ++ # at this point, the only reference to kill_me() task is ++ # the Task._wakeup() method in future._callbacks ++ raise Exception("code never reached") ++ ++ mock_handler = mock.Mock() ++ self.loop.set_debug(True) ++ self.loop.set_exception_handler(mock_handler) ++ ++ # schedule the task ++ coro = kill_me(self.loop) ++ task = asyncio.async(coro, loop=self.loop) ++ self.assertEqual(asyncio.Task.all_tasks(loop=self.loop), {task}) ++ ++ # execute the task so it waits for future ++ self.loop._run_once() ++ self.assertEqual(len(self.loop._ready), 0) ++ ++ # remove the future used in kill_me(), and references to the task ++ del coro.gi_frame.f_locals['future'] ++ coro = None ++ source_traceback = task._source_traceback ++ task = None ++ ++ # no more reference to kill_me() task: the task is destroyed by the GC ++ support.gc_collect() ++ ++ self.assertEqual(asyncio.Task.all_tasks(loop=self.loop), set()) ++ ++ mock_handler.assert_called_with(self.loop, { ++ 'message': 'Task was destroyed but it is pending!', ++ 'task': mock.ANY, ++ 'source_traceback': source_traceback, ++ }) ++ mock_handler.reset_mock() ++ ++ @mock.patch('asyncio.coroutines.logger') ++ def test_coroutine_never_yielded(self, m_log): ++ debug = asyncio.coroutines._DEBUG ++ try: ++ asyncio.coroutines._DEBUG = True ++ @asyncio.coroutine ++ def coro_noop(): ++ pass ++ finally: ++ asyncio.coroutines._DEBUG = debug ++ ++ tb_filename = __file__ ++ tb_lineno = sys._getframe().f_lineno + 2 ++ # create a coroutine object but don't use it ++ coro_noop() ++ support.gc_collect() ++ ++ self.assertTrue(m_log.error.called) ++ message = m_log.error.call_args[0][0] ++ func_filename, func_lineno = test_utils.get_function_source(coro_noop) ++ regex = (r'^ was never yielded from\n' ++ r'Coroutine object created at \(most recent call last\):\n' ++ r'.*\n' ++ r' File "%s", line %s, in test_coroutine_never_yielded\n' ++ r' coro_noop\(\)$' ++ % (re.escape(coro_noop.__qualname__), ++ re.escape(func_filename), func_lineno, ++ re.escape(tb_filename), tb_lineno)) ++ ++ self.assertRegex(message, re.compile(regex, re.DOTALL)) ++ ++ def test_task_source_traceback(self): ++ self.loop.set_debug(True) ++ ++ task = asyncio.Task(coroutine_function(), loop=self.loop) ++ lineno = sys._getframe().f_lineno - 1 ++ self.assertIsInstance(task._source_traceback, list) ++ self.assertEqual(task._source_traceback[-1][:3], ++ (__file__, ++ lineno, ++ 'test_task_source_traceback')) ++ self.loop.run_until_complete(task) ++ + + class GatherTestsBase: + + def setUp(self): +- self.one_loop = test_utils.TestLoop() +- self.other_loop = test_utils.TestLoop() +- +- def tearDown(self): +- self.one_loop.close() +- self.other_loop.close() ++ self.one_loop = self.new_test_loop() ++ self.other_loop = self.new_test_loop() ++ self.set_event_loop(self.one_loop, cleanup=False) + + def _run_loop(self, loop): + while loop._ready: +@@ -1558,13 +1747,9 @@ + self.assertEqual(fut.result(), [3, 1, exc, exc2]) + + def test_env_var_debug(self): +- path = os.path.dirname(asyncio.__file__) +- path = os.path.normpath(os.path.join(path, '..')) + code = '\n'.join(( +- 'import sys', +- 'sys.path.insert(0, %r)' % path, +- 'import asyncio.tasks', +- 'print(asyncio.tasks._DEBUG)')) ++ 'import asyncio.coroutines', ++ 'print(asyncio.coroutines._DEBUG)')) + + # Test with -E to not fail if the unit test was run 
with + # PYTHONASYNCIODEBUG set to a non-empty string +@@ -1584,7 +1769,7 @@ + self.assertEqual(stdout.rstrip(), b'False') + + +-class FutureGatherTests(GatherTestsBase, unittest.TestCase): ++class FutureGatherTests(GatherTestsBase, test_utils.TestCase): + + def wrap_futures(self, *futures): + return futures +@@ -1668,16 +1853,12 @@ + cb.assert_called_once_with(fut) + + +-class CoroutineGatherTests(GatherTestsBase, unittest.TestCase): ++class CoroutineGatherTests(GatherTestsBase, test_utils.TestCase): + + def setUp(self): + super().setUp() + asyncio.set_event_loop(self.one_loop) + +- def tearDown(self): +- asyncio.set_event_loop(None) +- super().tearDown() +- + def wrap_futures(self, *futures): + coros = [] + for fut in futures: +@@ -1695,14 +1876,14 @@ + gen2 = coro() + fut = asyncio.gather(gen1, gen2) + self.assertIs(fut._loop, self.one_loop) +- gen1.close() +- gen2.close() ++ self.one_loop.run_until_complete(fut) ++ ++ self.set_event_loop(self.other_loop, cleanup=False) + gen3 = coro() + gen4 = coro() +- fut = asyncio.gather(gen3, gen4, loop=self.other_loop) +- self.assertIs(fut._loop, self.other_loop) +- gen3.close() +- gen4.close() ++ fut2 = asyncio.gather(gen3, gen4, loop=self.other_loop) ++ self.assertIs(fut2._loop, self.other_loop) ++ self.other_loop.run_until_complete(fut2) + + def test_duplicate_coroutines(self): + @asyncio.coroutine +diff -r c0e311e010fc Lib/test/test_asyncio/test_unix_events.py +--- a/Lib/test/test_asyncio/test_unix_events.py ++++ b/Lib/test/test_asyncio/test_unix_events.py +@@ -29,14 +29,11 @@ + + + @unittest.skipUnless(signal, 'Signals are not supported') +-class SelectorEventLoopSignalTests(unittest.TestCase): ++class SelectorEventLoopSignalTests(test_utils.TestCase): + + def setUp(self): + self.loop = asyncio.SelectorEventLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.set_event_loop(self.loop) + + def test_check_signal(self): + self.assertRaises( +@@ -45,7 +42,7 @@ + ValueError, self.loop._check_signal, signal.NSIG + 1) + + def test_handle_signal_no_handler(self): +- self.loop._handle_signal(signal.NSIG + 1, ()) ++ self.loop._handle_signal(signal.NSIG + 1) + + def test_handle_signal_cancelled_handler(self): + h = asyncio.Handle(mock.Mock(), (), +@@ -53,7 +50,7 @@ + h.cancel() + self.loop._signal_handlers[signal.NSIG + 1] = h + self.loop.remove_signal_handler = mock.Mock() +- self.loop._handle_signal(signal.NSIG + 1, ()) ++ self.loop._handle_signal(signal.NSIG + 1) + self.loop.remove_signal_handler.assert_called_with(signal.NSIG + 1) + + @mock.patch('asyncio.unix_events.signal') +@@ -208,14 +205,11 @@ + + @unittest.skipUnless(hasattr(socket, 'AF_UNIX'), + 'UNIX Sockets are not supported') +-class SelectorEventLoopUnixSocketTests(unittest.TestCase): ++class SelectorEventLoopUnixSocketTests(test_utils.TestCase): + + def setUp(self): + self.loop = asyncio.SelectorEventLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() ++ self.set_event_loop(self.loop) + + def test_create_unix_server_existing_path_sock(self): + with test_utils.unix_socket_path() as path: +@@ -256,6 +250,24 @@ + 'A UNIX Domain Socket was expected'): + self.loop.run_until_complete(coro) + ++ @mock.patch('asyncio.unix_events.socket') ++ def test_create_unix_server_bind_error(self, m_socket): ++ # Ensure that the socket is closed on any bind error ++ sock = mock.Mock() ++ m_socket.socket.return_value = sock ++ ++ sock.bind.side_effect = OSError ++ coro = self.loop.create_unix_server(lambda: None, path="/test") ++ with 
self.assertRaises(OSError): ++ self.loop.run_until_complete(coro) ++ self.assertTrue(sock.close.called) ++ ++ sock.bind.side_effect = MemoryError ++ coro = self.loop.create_unix_server(lambda: None, path="/test") ++ with self.assertRaises(MemoryError): ++ self.loop.run_until_complete(coro) ++ self.assertTrue(sock.close.called) ++ + def test_create_unix_connection_path_sock(self): + coro = self.loop.create_unix_connection( + lambda: None, '/dev/null', sock=object()) +@@ -286,10 +298,10 @@ + self.loop.run_until_complete(coro) + + +-class UnixReadPipeTransportTests(unittest.TestCase): ++class UnixReadPipeTransportTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() ++ self.loop = self.new_test_loop() + self.protocol = test_utils.make_test_protocol(asyncio.Protocol) + self.pipe = mock.Mock(spec_set=io.RawIOBase) + self.pipe.fileno.return_value = 5 +@@ -423,6 +435,8 @@ + def test__call_connection_lost(self): + tr = unix_events._UnixReadPipeTransport( + self.loop, self.pipe, self.protocol) ++ self.assertIsNotNone(tr._protocol) ++ self.assertIsNotNone(tr._loop) + + err = None + tr._call_connection_lost(err) +@@ -430,15 +444,13 @@ + self.pipe.close.assert_called_with() + + self.assertIsNone(tr._protocol) +- self.assertEqual(2, sys.getrefcount(self.protocol), +- pprint.pformat(gc.get_referrers(self.protocol))) + self.assertIsNone(tr._loop) +- self.assertEqual(4, sys.getrefcount(self.loop), +- pprint.pformat(gc.get_referrers(self.loop))) + + def test__call_connection_lost_with_err(self): + tr = unix_events._UnixReadPipeTransport( + self.loop, self.pipe, self.protocol) ++ self.assertIsNotNone(tr._protocol) ++ self.assertIsNotNone(tr._loop) + + err = OSError() + tr._call_connection_lost(err) +@@ -446,18 +458,13 @@ + self.pipe.close.assert_called_with() + + self.assertIsNone(tr._protocol) ++ self.assertIsNone(tr._loop) + +- self.assertEqual(2, sys.getrefcount(self.protocol), +- pprint.pformat(gc.get_referrers(self.protocol))) +- self.assertIsNone(tr._loop) +- self.assertEqual(4, sys.getrefcount(self.loop), +- pprint.pformat(gc.get_referrers(self.loop))) + +- +-class UnixWritePipeTransportTests(unittest.TestCase): ++class UnixWritePipeTransportTests(test_utils.TestCase): + + def setUp(self): +- self.loop = test_utils.TestLoop() ++ self.loop = self.new_test_loop() + self.protocol = test_utils.make_test_protocol(asyncio.BaseProtocol) + self.pipe = mock.Mock(spec_set=io.RawIOBase) + self.pipe.fileno.return_value = 5 +@@ -709,6 +716,8 @@ + def test__call_connection_lost(self): + tr = unix_events._UnixWritePipeTransport( + self.loop, self.pipe, self.protocol) ++ self.assertIsNotNone(tr._protocol) ++ self.assertIsNotNone(tr._loop) + + err = None + tr._call_connection_lost(err) +@@ -716,15 +725,13 @@ + self.pipe.close.assert_called_with() + + self.assertIsNone(tr._protocol) +- self.assertEqual(2, sys.getrefcount(self.protocol), +- pprint.pformat(gc.get_referrers(self.protocol))) + self.assertIsNone(tr._loop) +- self.assertEqual(4, sys.getrefcount(self.loop), +- pprint.pformat(gc.get_referrers(self.loop))) + + def test__call_connection_lost_with_err(self): + tr = unix_events._UnixWritePipeTransport( + self.loop, self.pipe, self.protocol) ++ self.assertIsNotNone(tr._protocol) ++ self.assertIsNotNone(tr._loop) + + err = OSError() + tr._call_connection_lost(err) +@@ -732,11 +739,7 @@ + self.pipe.close.assert_called_with() + + self.assertIsNone(tr._protocol) +- self.assertEqual(2, sys.getrefcount(self.protocol), +- pprint.pformat(gc.get_referrers(self.protocol))) + 
self.assertIsNone(tr._loop) +- self.assertEqual(4, sys.getrefcount(self.loop), +- pprint.pformat(gc.get_referrers(self.loop))) + + def test_close(self): + tr = unix_events._UnixWritePipeTransport( +@@ -816,7 +819,7 @@ + ignore_warnings = mock.patch.object(log.logger, "warning") + + def setUp(self): +- self.loop = test_utils.TestLoop() ++ self.loop = self.new_test_loop() + self.running = False + self.zombies = {} + +@@ -1374,7 +1377,7 @@ + + # attach a new loop + old_loop = self.loop +- self.loop = test_utils.TestLoop() ++ self.loop = self.new_test_loop() + patch = mock.patch.object + + with patch(old_loop, "remove_signal_handler") as m_old_remove, \ +@@ -1429,7 +1432,7 @@ + self.assertFalse(callback3.called) + + # attach a new loop +- self.loop = test_utils.TestLoop() ++ self.loop = self.new_test_loop() + + with mock.patch.object( + self.loop, "add_signal_handler") as m_add_signal_handler: +@@ -1487,12 +1490,12 @@ + self.assertFalse(self.watcher._zombies) + + +-class SafeChildWatcherTests (ChildWatcherTestsMixin, unittest.TestCase): ++class SafeChildWatcherTests (ChildWatcherTestsMixin, test_utils.TestCase): + def create_watcher(self): + return asyncio.SafeChildWatcher() + + +-class FastChildWatcherTests (ChildWatcherTestsMixin, unittest.TestCase): ++class FastChildWatcherTests (ChildWatcherTestsMixin, test_utils.TestCase): + def create_watcher(self): + return asyncio.FastChildWatcher() + +diff -r c0e311e010fc Lib/test/test_asyncio/test_windows_events.py +--- a/Lib/test/test_asyncio/test_windows_events.py ++++ b/Lib/test/test_asyncio/test_windows_events.py +@@ -9,6 +9,7 @@ + + import asyncio + from asyncio import _overlapped ++from asyncio import test_utils + from asyncio import windows_events + + +@@ -26,15 +27,11 @@ + self.trans.close() + + +-class ProactorTests(unittest.TestCase): ++class ProactorTests(test_utils.TestCase): + + def setUp(self): + self.loop = asyncio.ProactorEventLoop() +- asyncio.set_event_loop(None) +- +- def tearDown(self): +- self.loop.close() +- self.loop = None ++ self.set_event_loop(self.loop) + + def test_close(self): + a, b = self.loop._socketpair() +@@ -97,38 +94,48 @@ + event = _overlapped.CreateEvent(None, True, False, None) + self.addCleanup(_winapi.CloseHandle, event) + +- # Wait for unset event with 0.2s timeout; ++ # Wait for unset event with 0.5s timeout; + # result should be False at timeout +- f = self.loop._proactor.wait_for_handle(event, 0.2) ++ fut = self.loop._proactor.wait_for_handle(event, 0.5) + start = self.loop.time() +- self.loop.run_until_complete(f) ++ self.loop.run_until_complete(fut) + elapsed = self.loop.time() - start +- self.assertFalse(f.result()) +- self.assertTrue(0.18 < elapsed < 0.9, elapsed) ++ self.assertFalse(fut.result()) ++ self.assertTrue(0.48 < elapsed < 0.9, elapsed) + + _overlapped.SetEvent(event) + + # Wait for for set event; + # result should be True immediately +- f = self.loop._proactor.wait_for_handle(event, 10) ++ fut = self.loop._proactor.wait_for_handle(event, 10) + start = self.loop.time() +- self.loop.run_until_complete(f) ++ self.loop.run_until_complete(fut) + elapsed = self.loop.time() - start +- self.assertTrue(f.result()) +- self.assertTrue(0 <= elapsed < 0.1, elapsed) ++ self.assertTrue(fut.result()) ++ self.assertTrue(0 <= elapsed < 0.3, elapsed) + +- _overlapped.ResetEvent(event) ++ # Tulip issue #195: cancelling a done _WaitHandleFuture must not crash ++ fut.cancel() ++ ++ def test_wait_for_handle_cancel(self): ++ event = _overlapped.CreateEvent(None, True, False, None) ++ 
self.addCleanup(_winapi.CloseHandle, event) + + # Wait for unset event with a cancelled future; + # CancelledError should be raised immediately +- f = self.loop._proactor.wait_for_handle(event, 10) +- f.cancel() ++ fut = self.loop._proactor.wait_for_handle(event, 10) ++ fut.cancel() + start = self.loop.time() + with self.assertRaises(asyncio.CancelledError): +- self.loop.run_until_complete(f) ++ self.loop.run_until_complete(fut) + elapsed = self.loop.time() - start + self.assertTrue(0 <= elapsed < 0.1, elapsed) + ++ # Tulip issue #195: cancelling a _WaitHandleFuture twice must not crash ++ fut = self.loop._proactor.wait_for_handle(event) ++ fut.cancel() ++ fut.cancel() ++ + + if __name__ == '__main__': + unittest.main() +diff -r c0e311e010fc Lib/test/test_asyncio/test_windows_utils.py +--- a/Lib/test/test_asyncio/test_windows_utils.py ++++ b/Lib/test/test_asyncio/test_windows_utils.py +@@ -51,6 +51,15 @@ + self.assertRaises(ValueError, + windows_utils.socketpair, proto=1) + ++ @mock.patch('asyncio.windows_utils.socket') ++ def test_winsocketpair_close(self, m_socket): ++ m_socket.AF_INET = socket.AF_INET ++ m_socket.SOCK_STREAM = socket.SOCK_STREAM ++ sock = mock.Mock() ++ m_socket.socket.return_value = sock ++ sock.bind.side_effect = OSError ++ self.assertRaises(OSError, windows_utils.socketpair) ++ self.assertTrue(sock.close.called) + + + class PipeTests(unittest.TestCase): +@@ -155,6 +164,8 @@ + self.assertTrue(msg.upper().rstrip().startswith(out)) + self.assertTrue(b"stderr".startswith(err)) + ++ p.wait() ++ + + if __name__ == '__main__': + unittest.main() +diff -r c0e311e010fc Lib/test/test_asyncore.py +--- a/Lib/test/test_asyncore.py ++++ b/Lib/test/test_asyncore.py +@@ -5,14 +5,12 @@ + import socket + import sys + import time +-import warnings + import errno + import struct ++import warnings + + from test import support +-from test.support import TESTFN, run_unittest, unlink, HOST, HOSTv6 + from io import BytesIO +-from io import StringIO + + try: + import threading +@@ -94,7 +92,7 @@ + """Helper function to bind a socket according to its family.""" + if HAS_UNIX_SOCKETS and sock.family == socket.AF_UNIX: + # Make sure the path doesn't exist. +- unlink(addr) ++ support.unlink(addr) + sock.bind(addr) + + +@@ -257,40 +255,29 @@ + d = asyncore.dispatcher() + + # capture output of dispatcher.log() (to stderr) +- fp = StringIO() +- stderr = sys.stderr + l1 = "Lovely spam! Wonderful spam!" + l2 = "I don't like spam!" +- try: +- sys.stderr = fp ++ with support.captured_stderr() as stderr: + d.log(l1) + d.log(l2) +- finally: +- sys.stderr = stderr + +- lines = fp.getvalue().splitlines() ++ lines = stderr.getvalue().splitlines() + self.assertEqual(lines, ['log: %s' % l1, 'log: %s' % l2]) + + def test_log_info(self): + d = asyncore.dispatcher() + + # capture output of dispatcher.log_info() (to stdout via print) +- fp = StringIO() +- stdout = sys.stdout + l1 = "Have you got anything without spam?" + l2 = "Why can't she have egg bacon spam and sausage?" + l3 = "THAT'S got spam in it!" 
+- try: +- sys.stdout = fp ++ with support.captured_stdout() as stdout: + d.log_info(l1, 'EGGS') + d.log_info(l2) + d.log_info(l3, 'SPAM') +- finally: +- sys.stdout = stdout + +- lines = fp.getvalue().splitlines() ++ lines = stdout.getvalue().splitlines() + expected = ['EGGS: %s' % l1, 'info: %s' % l2, 'SPAM: %s' % l3] +- + self.assertEqual(lines, expected) + + def test_unhandled(self): +@@ -298,18 +285,13 @@ + d.ignore_log_types = () + + # capture output of dispatcher.log_info() (to stdout via print) +- fp = StringIO() +- stdout = sys.stdout +- try: +- sys.stdout = fp ++ with support.captured_stdout() as stdout: + d.handle_expt() + d.handle_read() + d.handle_write() + d.handle_connect() +- finally: +- sys.stdout = stdout + +- lines = fp.getvalue().splitlines() ++ lines = stdout.getvalue().splitlines() + expected = ['warning: unhandled incoming priority event', + 'warning: unhandled read event', + 'warning: unhandled write event', +@@ -378,7 +360,7 @@ + data = b"Suppose there isn't a 16-ton weight?" + d = dispatcherwithsend_noread() + d.create_socket() +- d.connect((HOST, port)) ++ d.connect((support.HOST, port)) + + # give time for socket to connect + time.sleep(0.1) +@@ -410,14 +392,14 @@ + class FileWrapperTest(unittest.TestCase): + def setUp(self): + self.d = b"It's not dead, it's sleeping!" +- with open(TESTFN, 'wb') as file: ++ with open(support.TESTFN, 'wb') as file: + file.write(self.d) + + def tearDown(self): +- unlink(TESTFN) ++ support.unlink(support.TESTFN) + + def test_recv(self): +- fd = os.open(TESTFN, os.O_RDONLY) ++ fd = os.open(support.TESTFN, os.O_RDONLY) + w = asyncore.file_wrapper(fd) + os.close(fd) + +@@ -431,20 +413,20 @@ + def test_send(self): + d1 = b"Come again?" + d2 = b"I want to buy some cheese." +- fd = os.open(TESTFN, os.O_WRONLY | os.O_APPEND) ++ fd = os.open(support.TESTFN, os.O_WRONLY | os.O_APPEND) + w = asyncore.file_wrapper(fd) + os.close(fd) + + w.write(d1) + w.send(d2) + w.close() +- with open(TESTFN, 'rb') as file: ++ with open(support.TESTFN, 'rb') as file: + self.assertEqual(file.read(), self.d + d1 + d2) + + @unittest.skipUnless(hasattr(asyncore, 'file_dispatcher'), + 'asyncore.file_dispatcher required') + def test_dispatcher(self): +- fd = os.open(TESTFN, os.O_RDONLY) ++ fd = os.open(support.TESTFN, os.O_RDONLY) + data = [] + class FileDispatcher(asyncore.file_dispatcher): + def handle_read(self): +@@ -454,6 +436,22 @@ + asyncore.loop(timeout=0.01, use_poll=True, count=2) + self.assertEqual(b"".join(data), self.d) + ++ def test_resource_warning(self): ++ # Issue #11453 ++ fd = os.open(support.TESTFN, os.O_RDONLY) ++ f = asyncore.file_wrapper(fd) ++ with support.check_warnings(('', ResourceWarning)): ++ f = None ++ support.gc_collect() ++ ++ def test_close_twice(self): ++ fd = os.open(support.TESTFN, os.O_RDONLY) ++ f = asyncore.file_wrapper(fd) ++ f.close() ++ self.assertEqual(f.fd, -1) ++ # calling close twice should not fail ++ f.close() ++ + + class BaseTestHandler(asyncore.dispatcher): + +@@ -815,12 +813,12 @@ + + class TestAPI_UseIPv4Sockets(BaseTestAPI): + family = socket.AF_INET +- addr = (HOST, 0) ++ addr = (support.HOST, 0) + + @unittest.skipUnless(support.IPV6_ENABLED, 'IPv6 support required') + class TestAPI_UseIPv6Sockets(BaseTestAPI): + family = socket.AF_INET6 +- addr = (HOSTv6, 0) ++ addr = (support.HOSTv6, 0) + + @unittest.skipUnless(HAS_UNIX_SOCKETS, 'Unix sockets required') + class TestAPI_UseUnixSockets(BaseTestAPI): +@@ -829,7 +827,7 @@ + addr = support.TESTFN + + def tearDown(self): +- unlink(self.addr) ++ 
support.unlink(self.addr) + BaseTestAPI.tearDown(self) + + class TestAPI_UseIPv4Select(TestAPI_UseIPv4Sockets, unittest.TestCase): +diff -r c0e311e010fc Lib/test/test_cmd.py +--- a/Lib/test/test_cmd.py ++++ b/Lib/test/test_cmd.py +@@ -229,7 +229,7 @@ + trace = support.import_module('trace') + tracer=trace.Trace(ignoredirs=[sys.base_prefix, sys.base_exec_prefix,], + trace=0, count=1) +- tracer.run('reload(cmd);test_main()') ++ tracer.run('import importlib; importlib.reload(cmd); test_main()') + r=tracer.results() + print("Writing coverage results...") + r.write_results(show_missing=True, summary=True, coverdir=coverdir) +diff -r c0e311e010fc Lib/test/test_codecmaps_cn.py +--- a/Lib/test/test_codecmaps_cn.py ++++ b/Lib/test/test_codecmaps_cn.py +@@ -25,8 +25,5 @@ + 'trunk/charset/data/xml/gb-18030-2000.xml' + + +-def test_main(): +- support.run_unittest(__name__) +- + if __name__ == "__main__": +- test_main() ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_codecmaps_hk.py +--- a/Lib/test/test_codecmaps_hk.py ++++ b/Lib/test/test_codecmaps_hk.py +@@ -12,9 +12,5 @@ + encoding = 'big5hkscs' + mapfileurl = 'http://people.freebsd.org/~perky/i18n/BIG5HKSCS-2004.TXT' + +-def test_main(): +- support.run_unittest(__name__) +- + if __name__ == "__main__": +- support.use_resources = ['urlfetch'] +- test_main() ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_codecmaps_jp.py +--- a/Lib/test/test_codecmaps_jp.py ++++ b/Lib/test/test_codecmaps_jp.py +@@ -59,8 +59,5 @@ + mapfileurl = 'http://people.freebsd.org/~perky/i18n/SHIFT_JISX0213.TXT' + + +-def test_main(): +- support.run_unittest(__name__) +- + if __name__ == "__main__": +- test_main() ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_codecmaps_kr.py +--- a/Lib/test/test_codecmaps_kr.py ++++ b/Lib/test/test_codecmaps_kr.py +@@ -36,8 +36,5 @@ + pass_enctest = [(b'\\', '\u20a9')] + pass_dectest = [(b'\\', '\u20a9')] + +-def test_main(): +- support.run_unittest(__name__) +- + if __name__ == "__main__": +- test_main() ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_codecmaps_tw.py +--- a/Lib/test/test_codecmaps_tw.py ++++ b/Lib/test/test_codecmaps_tw.py +@@ -26,8 +26,5 @@ + (b"\xFFxy", "replace", "\ufffdxy"), + ) + +-def test_main(): +- support.run_unittest(__name__) +- + if __name__ == "__main__": +- test_main() ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_collections.py +--- a/Lib/test/test_collections.py ++++ b/Lib/test/test_collections.py +@@ -720,14 +720,166 @@ + + cs = MyComparableSet() + ncs = MyNonComparableSet() ++ self.assertFalse(ncs < cs) ++ self.assertTrue(ncs <= cs) ++ self.assertFalse(ncs > cs) ++ self.assertTrue(ncs >= cs) ++ ++ def assertSameSet(self, s1, s2): ++ # coerce both to a real set then check equality ++ self.assertSetEqual(set(s1), set(s2)) ++ ++ def test_Set_interoperability_with_real_sets(self): ++ # Issue: 8743 ++ class ListSet(Set): ++ def __init__(self, elements=()): ++ self.data = [] ++ for elem in elements: ++ if elem not in self.data: ++ self.data.append(elem) ++ def __contains__(self, elem): ++ return elem in self.data ++ def __iter__(self): ++ return iter(self.data) ++ def __len__(self): ++ return len(self.data) ++ def __repr__(self): ++ return 'Set({!r})'.format(self.data) ++ ++ r1 = set('abc') ++ r2 = set('bcd') ++ r3 = set('abcde') ++ f1 = ListSet('abc') ++ f2 = ListSet('bcd') ++ f3 = ListSet('abcde') ++ l1 = list('abccba') ++ l2 = list('bcddcb') ++ l3 = list('abcdeedcba') ++ ++ target = r1 & r2 ++ self.assertSameSet(f1 & f2, target) ++ self.assertSameSet(f1 & r2, target) ++ 
self.assertSameSet(r2 & f1, target) ++ self.assertSameSet(f1 & l2, target) ++ ++ target = r1 | r2 ++ self.assertSameSet(f1 | f2, target) ++ self.assertSameSet(f1 | r2, target) ++ self.assertSameSet(r2 | f1, target) ++ self.assertSameSet(f1 | l2, target) ++ ++ fwd_target = r1 - r2 ++ rev_target = r2 - r1 ++ self.assertSameSet(f1 - f2, fwd_target) ++ self.assertSameSet(f2 - f1, rev_target) ++ self.assertSameSet(f1 - r2, fwd_target) ++ self.assertSameSet(f2 - r1, rev_target) ++ self.assertSameSet(r1 - f2, fwd_target) ++ self.assertSameSet(r2 - f1, rev_target) ++ self.assertSameSet(f1 - l2, fwd_target) ++ self.assertSameSet(f2 - l1, rev_target) ++ ++ target = r1 ^ r2 ++ self.assertSameSet(f1 ^ f2, target) ++ self.assertSameSet(f1 ^ r2, target) ++ self.assertSameSet(r2 ^ f1, target) ++ self.assertSameSet(f1 ^ l2, target) ++ ++ # Don't change the following to use assertLess or other ++ # "more specific" unittest assertions. The current ++ # assertTrue/assertFalse style makes the pattern of test ++ # case combinations clear and allows us to know for sure ++ # the exact operator being invoked. ++ ++ # proper subset ++ self.assertTrue(f1 < f3) ++ self.assertFalse(f1 < f1) ++ self.assertFalse(f1 < f2) ++ self.assertTrue(r1 < f3) ++ self.assertFalse(r1 < f1) ++ self.assertFalse(r1 < f2) ++ self.assertTrue(r1 < r3) ++ self.assertFalse(r1 < r1) ++ self.assertFalse(r1 < r2) + with self.assertRaises(TypeError): +- ncs < cs ++ f1 < l3 + with self.assertRaises(TypeError): +- ncs <= cs ++ f1 < l1 + with self.assertRaises(TypeError): +- cs > ncs ++ f1 < l2 ++ ++ # any subset ++ self.assertTrue(f1 <= f3) ++ self.assertTrue(f1 <= f1) ++ self.assertFalse(f1 <= f2) ++ self.assertTrue(r1 <= f3) ++ self.assertTrue(r1 <= f1) ++ self.assertFalse(r1 <= f2) ++ self.assertTrue(r1 <= r3) ++ self.assertTrue(r1 <= r1) ++ self.assertFalse(r1 <= r2) + with self.assertRaises(TypeError): +- cs >= ncs ++ f1 <= l3 ++ with self.assertRaises(TypeError): ++ f1 <= l1 ++ with self.assertRaises(TypeError): ++ f1 <= l2 ++ ++ # proper superset ++ self.assertTrue(f3 > f1) ++ self.assertFalse(f1 > f1) ++ self.assertFalse(f2 > f1) ++ self.assertTrue(r3 > r1) ++ self.assertFalse(f1 > r1) ++ self.assertFalse(f2 > r1) ++ self.assertTrue(r3 > r1) ++ self.assertFalse(r1 > r1) ++ self.assertFalse(r2 > r1) ++ with self.assertRaises(TypeError): ++ f1 > l3 ++ with self.assertRaises(TypeError): ++ f1 > l1 ++ with self.assertRaises(TypeError): ++ f1 > l2 ++ ++ # any superset ++ self.assertTrue(f3 >= f1) ++ self.assertTrue(f1 >= f1) ++ self.assertFalse(f2 >= f1) ++ self.assertTrue(r3 >= r1) ++ self.assertTrue(f1 >= r1) ++ self.assertFalse(f2 >= r1) ++ self.assertTrue(r3 >= r1) ++ self.assertTrue(r1 >= r1) ++ self.assertFalse(r2 >= r1) ++ with self.assertRaises(TypeError): ++ f1 >= l3 ++ with self.assertRaises(TypeError): ++ f1 >=l1 ++ with self.assertRaises(TypeError): ++ f1 >= l2 ++ ++ # equality ++ self.assertTrue(f1 == f1) ++ self.assertTrue(r1 == f1) ++ self.assertTrue(f1 == r1) ++ self.assertFalse(f1 == f3) ++ self.assertFalse(r1 == f3) ++ self.assertFalse(f1 == r3) ++ self.assertFalse(f1 == l3) ++ self.assertFalse(f1 == l1) ++ self.assertFalse(f1 == l2) ++ ++ # inequality ++ self.assertFalse(f1 != f1) ++ self.assertFalse(r1 != f1) ++ self.assertFalse(f1 != r1) ++ self.assertTrue(f1 != f3) ++ self.assertTrue(r1 != f3) ++ self.assertTrue(f1 != r3) ++ self.assertTrue(f1 != l3) ++ self.assertTrue(f1 != l1) ++ self.assertTrue(f1 != l2) + + def test_Mapping(self): + for sample in [dict]: +diff -r c0e311e010fc Lib/test/test_compile.py +--- 
a/Lib/test/test_compile.py
++++ b/Lib/test/test_compile.py
+@@ -1,3 +1,4 @@
++import math
+ import unittest
+ import sys
+ import _ast
+@@ -501,8 +502,43 @@
+         check_limit("a", "*a")
+
+
+-def test_main():
+-    support.run_unittest(TestSpecifics)
++class TestStackSize(unittest.TestCase):
++    # These tests check that the computed stack size for a code object
++    # stays within reasonable bounds (see issue #21523 for an example
++    # dysfunction).
++    N = 100
++
++    def check_stack_size(self, code):
++        # To assert that the alleged stack size is not O(N), we
++        # check that it is smaller than log(N).
++        if isinstance(code, str):
++            code = compile(code, "<foo>", "single")
++        max_size = math.ceil(math.log(len(code.co_code)))
++        self.assertLessEqual(code.co_stacksize, max_size)
++
++    def test_and(self):
++        self.check_stack_size("x and " * self.N + "x")
++
++    def test_or(self):
++        self.check_stack_size("x or " * self.N + "x")
++
++    def test_and_or(self):
++        self.check_stack_size("x and x or " * self.N + "x")
++
++    def test_chained_comparison(self):
++        self.check_stack_size("x < " * self.N + "x")
++
++    def test_if_else(self):
++        self.check_stack_size("x if x else " * self.N + "x")
++
++    def test_binop(self):
++        self.check_stack_size("x + " * self.N + "x")
++
++    def test_func_and(self):
++        code = "def f(x):\n"
++        code += " x and x\n" * self.N
++        self.check_stack_size(code)
++
+
+ if __name__ == "__main__":
+-    test_main()
++    unittest.main()
+diff -r c0e311e010fc Lib/test/test_decimal.py
+--- a/Lib/test/test_decimal.py
++++ b/Lib/test/test_decimal.py
+@@ -5429,7 +5429,7 @@
+     all_tests.insert(0, CheckAttributes)
+
+
+-def test_main(arith=False, verbose=None, todo_tests=None, debug=None):
++def test_main(arith=None, verbose=None, todo_tests=None, debug=None):
+     """ Execute the tests.
+ + Runs all arithmetic tests if arith is True or if the "decimal" resource +@@ -5439,7 +5439,7 @@ + init(C) + init(P) + global TEST_ALL, DEBUG +- TEST_ALL = arith or is_resource_enabled('decimal') ++ TEST_ALL = arith if arith is not None else is_resource_enabled('decimal') + DEBUG = debug + + if todo_tests is None: +diff -r c0e311e010fc Lib/test/test_descr.py +--- a/Lib/test/test_descr.py ++++ b/Lib/test/test_descr.py +@@ -1149,7 +1149,7 @@ + except (TypeError, UnicodeEncodeError): + pass + else: +- raise TestFailed("[chr(128)] slots not caught") ++ self.fail("[chr(128)] slots not caught") + + # Test leaks + class Counted(object): +diff -r c0e311e010fc Lib/test/test_difflib.py +--- a/Lib/test/test_difflib.py ++++ b/Lib/test/test_difflib.py +@@ -76,6 +76,15 @@ + diff_gen = difflib.unified_diff([], []) + self.assertRaises(StopIteration, next, diff_gen) + ++ def test_matching_blocks_cache(self): ++ # Issue #21635 ++ s = difflib.SequenceMatcher(None, "abxcd", "abcd") ++ first = s.get_matching_blocks() ++ second = s.get_matching_blocks() ++ self.assertEqual(second[0].size, 2) ++ self.assertEqual(second[1].size, 2) ++ self.assertEqual(second[2].size, 0) ++ + def test_added_tab_hint(self): + # Check fix for bug #1488943 + diff = list(difflib.Differ().compare(["\tI am a buggy"],["\t\tI am a bug"])) +diff -r c0e311e010fc Lib/test/test_email/__init__.py +--- a/Lib/test/test_email/__init__.py ++++ b/Lib/test/test_email/__init__.py +@@ -1,31 +1,16 @@ + import os + import sys + import unittest +-import test.support + import collections + import email + from email.message import Message + from email._policybase import compat32 ++from test.support import load_package_tests + from test.test_email import __file__ as landmark + +-# Run all tests in package for '-m unittest test.test_email' +-def load_tests(loader, standard_tests, pattern): +- this_dir = os.path.dirname(__file__) +- if pattern is None: +- pattern = "test*" +- package_tests = loader.discover(start_dir=this_dir, pattern=pattern) +- standard_tests.addTests(package_tests) +- return standard_tests +- +- +-# used by regrtest and __main__. +-def test_main(): +- here = os.path.dirname(__file__) +- # Unittest mucks with the path, so we have to save and restore +- # it to keep regrtest happy. +- savepath = sys.path[:] +- test.support._run_suite(unittest.defaultTestLoader.discover(here)) +- sys.path[:] = savepath ++# Load all tests in package ++def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) + + + # helper code used by a number of test modules. 
+diff -r c0e311e010fc Lib/test/test_email/__main__.py +--- a/Lib/test/test_email/__main__.py ++++ b/Lib/test/test_email/__main__.py +@@ -1,3 +1,4 @@ +-from test.test_email import test_main ++from test.test_email import load_tests ++import unittest + +-test_main() ++unittest.main() +diff -r c0e311e010fc Lib/test/test_email/test_email.py +--- a/Lib/test/test_email/test_email.py ++++ b/Lib/test/test_email/test_email.py +@@ -3390,6 +3390,31 @@ + self.assertIsInstance(msg.get_payload(), str) + self.assertIsInstance(msg.get_payload(decode=True), bytes) + ++ def test_bytes_parser_does_not_close_file(self): ++ with openfile('msg_02.txt', 'rb') as fp: ++ email.parser.BytesParser().parse(fp) ++ self.assertFalse(fp.closed) ++ ++ def test_bytes_parser_on_exception_does_not_close_file(self): ++ with openfile('msg_15.txt', 'rb') as fp: ++ bytesParser = email.parser.BytesParser ++ self.assertRaises(email.errors.StartBoundaryNotFoundDefect, ++ bytesParser(policy=email.policy.strict).parse, ++ fp) ++ self.assertFalse(fp.closed) ++ ++ def test_parser_does_not_close_file(self): ++ with openfile('msg_02.txt', 'r') as fp: ++ email.parser.Parser().parse(fp) ++ self.assertFalse(fp.closed) ++ ++ def test_parser_on_exception_does_not_close_file(self): ++ with openfile('msg_15.txt', 'r') as fp: ++ parser = email.parser.Parser ++ self.assertRaises(email.errors.StartBoundaryNotFoundDefect, ++ parser(policy=email.policy.strict).parse, fp) ++ self.assertFalse(fp.closed) ++ + def test_whitespace_continuation(self): + eq = self.assertEqual + # This message contains a line after the Subject: header that has only +diff -r c0e311e010fc Lib/test/test_enum.py +--- a/Lib/test/test_enum.py ++++ b/Lib/test/test_enum.py +@@ -1528,9 +1528,7 @@ + helper = pydoc.Helper(output=output) + helper(self.Color) + result = output.getvalue().strip() +- if result != expected_text: +- print_diffs(expected_text, result) +- self.fail("outputs are not equal, see diff above") ++ self.assertEqual(result, expected_text) + + def test_inspect_getmembers(self): + values = dict(( +diff -r c0e311e010fc Lib/test/test_frame.py +--- a/Lib/test/test_frame.py ++++ b/Lib/test/test_frame.py +@@ -1,5 +1,6 @@ + import gc + import sys ++import types + import unittest + import weakref + +@@ -109,6 +110,57 @@ + self.assertIs(None, wr()) + + ++class FrameLocalsTest(unittest.TestCase): ++ """ ++ Tests for the .f_locals attribute. 
++ """ ++ ++ def make_frames(self): ++ def outer(): ++ x = 5 ++ y = 6 ++ def inner(): ++ z = x + 2 ++ 1/0 ++ t = 9 ++ return inner() ++ try: ++ outer() ++ except ZeroDivisionError as e: ++ tb = e.__traceback__ ++ frames = [] ++ while tb: ++ frames.append(tb.tb_frame) ++ tb = tb.tb_next ++ return frames ++ ++ def test_locals(self): ++ f, outer, inner = self.make_frames() ++ outer_locals = outer.f_locals ++ self.assertIsInstance(outer_locals.pop('inner'), types.FunctionType) ++ self.assertEqual(outer_locals, {'x': 5, 'y': 6}) ++ inner_locals = inner.f_locals ++ self.assertEqual(inner_locals, {'x': 5, 'z': 7}) ++ ++ def test_clear_locals(self): ++ # Test f_locals after clear() (issue #21897) ++ f, outer, inner = self.make_frames() ++ outer.clear() ++ inner.clear() ++ self.assertEqual(outer.f_locals, {}) ++ self.assertEqual(inner.f_locals, {}) ++ ++ def test_locals_clear_locals(self): ++ # Test f_locals before and after clear() (to exercise caching) ++ f, outer, inner = self.make_frames() ++ outer.f_locals ++ inner.f_locals ++ outer.clear() ++ inner.clear() ++ self.assertEqual(outer.f_locals, {}) ++ self.assertEqual(inner.f_locals, {}) ++ ++ + def test_main(): + support.run_unittest(__name__) + +diff -r c0e311e010fc Lib/test/test_gettext.py +--- a/Lib/test/test_gettext.py ++++ b/Lib/test/test_gettext.py +@@ -77,7 +77,7 @@ + def tearDown(self): + self.env.__exit__() + del self.env +- shutil.rmtree(os.path.split(LOCALEDIR)[0]) ++ support.rmtree(os.path.split(LOCALEDIR)[0]) + + + class GettextTestCase1(GettextBaseTest): +diff -r c0e311e010fc Lib/test/test_grammar.py +--- a/Lib/test/test_grammar.py ++++ b/Lib/test/test_grammar.py +@@ -80,6 +80,12 @@ + x = .3e14 + x = 3.1e4 + ++ def test_float_exponent_tokenization(self): ++ # See issue 21642. ++ self.assertEqual(1 if 1else 0, 1) ++ self.assertEqual(1 if 0else 0, 0) ++ self.assertRaises(SyntaxError, eval, "0 if 1Else 0") ++ + def test_string_literals(self): + x = ''; y = ""; self.assertTrue(len(x) == 0 and x == y) + x = '\''; y = "'"; self.assertTrue(len(x) == 1 and x == y and ord(x) == 39) +@@ -384,6 +390,31 @@ + check_syntax_error(self, "x + 1 = 1") + check_syntax_error(self, "a + 1 = b + 2") + ++ # Check the heuristic for print & exec covers significant cases ++ # As well as placing some limits on false positives ++ def test_former_statements_refer_to_builtins(self): ++ keywords = "print", "exec" ++ # Cases where we want the custom error ++ cases = [ ++ "{} foo", ++ "{} {{1:foo}}", ++ "if 1: {} foo", ++ "if 1: {} {{1:foo}}", ++ "if 1:\n {} foo", ++ "if 1:\n {} {{1:foo}}", ++ ] ++ for keyword in keywords: ++ custom_msg = "call to '{}'".format(keyword) ++ for case in cases: ++ source = case.format(keyword) ++ with self.subTest(source=source): ++ with self.assertRaisesRegex(SyntaxError, custom_msg): ++ exec(source) ++ source = source.replace("foo", "(foo.)") ++ with self.subTest(source=source): ++ with self.assertRaisesRegex(SyntaxError, "invalid syntax"): ++ exec(source) ++ + def test_del_stmt(self): + # 'del' exprlist + abc = [1,2,3] +diff -r c0e311e010fc Lib/test/test_httpservers.py +--- a/Lib/test/test_httpservers.py ++++ b/Lib/test/test_httpservers.py +@@ -125,7 +125,7 @@ + + def test_request_line_trimming(self): + self.con._http_vsn_str = 'HTTP/1.1\n' +- self.con.putrequest('GET', '/') ++ self.con.putrequest('XYZBOGUS', '/') + self.con.endheaders() + res = self.con.getresponse() + self.assertEqual(res.status, 501) +@@ -152,8 +152,9 @@ + self.assertEqual(res.status, 501) + + def test_version_none(self): ++ # Test that a valid method is 
rejected when not HTTP/1.x + self.con._http_vsn_str = '' +- self.con.putrequest('PUT', '/') ++ self.con.putrequest('CUSTOM', '/') + self.con.endheaders() + res = self.con.getresponse() + self.assertEqual(res.status, 400) +@@ -345,10 +346,13 @@ + self.cwd = os.getcwd() + self.parent_dir = tempfile.mkdtemp() + self.cgi_dir = os.path.join(self.parent_dir, 'cgi-bin') ++ self.cgi_child_dir = os.path.join(self.cgi_dir, 'child-dir') + os.mkdir(self.cgi_dir) ++ os.mkdir(self.cgi_child_dir) + self.nocgi_path = None + self.file1_path = None + self.file2_path = None ++ self.file3_path = None + + # The shebang line should be pure ASCII: use symlink if possible. + # See issue #7668. +@@ -382,6 +386,11 @@ + file2.write(cgi_file2 % self.pythonexe) + os.chmod(self.file2_path, 0o777) + ++ self.file3_path = os.path.join(self.cgi_child_dir, 'file3.py') ++ with open(self.file3_path, 'w', encoding='utf-8') as file3: ++ file3.write(cgi_file1 % self.pythonexe) ++ os.chmod(self.file3_path, 0o777) ++ + os.chdir(self.parent_dir) + + def tearDown(self): +@@ -395,6 +404,9 @@ + os.remove(self.file1_path) + if self.file2_path: + os.remove(self.file2_path) ++ if self.file3_path: ++ os.remove(self.file3_path) ++ os.rmdir(self.cgi_child_dir) + os.rmdir(self.cgi_dir) + os.rmdir(self.parent_dir) + finally: +@@ -485,6 +497,16 @@ + (res.read(), res.getheader('Content-type'), res.status)) + self.assertEqual(os.environ['SERVER_SOFTWARE'], signature) + ++ def test_urlquote_decoding_in_cgi_check(self): ++ res = self.request('/cgi-bin%2ffile1.py') ++ self.assertEqual((b'Hello World' + self.linesep, 'text/html', 200), ++ (res.read(), res.getheader('Content-type'), res.status)) ++ ++ def test_nested_cgi_path_issue21323(self): ++ res = self.request('/cgi-bin/child-dir/file3.py') ++ self.assertEqual((b'Hello World' + self.linesep, 'text/html', 200), ++ (res.read(), res.getheader('Content-type'), res.status)) ++ + + class SocketlessRequestHandler(SimpleHTTPRequestHandler): + def __init__(self): +diff -r c0e311e010fc Lib/test/test_idle.py +--- a/Lib/test/test_idle.py ++++ b/Lib/test/test_idle.py +@@ -13,8 +13,4 @@ + load_tests = idletest.load_tests + + if __name__ == '__main__': +- # Until unittest supports resources, we emulate regrtest's -ugui +- # so loaded tests run the same as if textually present here. +- # If any Idle test ever needs another resource, add it to the list. +- support.use_resources = ['gui'] # use_resources is initially None + unittest.main(verbosity=2, exit=False) +diff -r c0e311e010fc Lib/test/test_imaplib.py +--- a/Lib/test/test_imaplib.py ++++ b/Lib/test/test_imaplib.py +@@ -501,5 +501,4 @@ + + + if __name__ == "__main__": +- support.use_resources = ['network'] + unittest.main() +diff -r c0e311e010fc Lib/test/test_import.py +--- a/Lib/test/test_import.py ++++ b/Lib/test/test_import.py +@@ -190,12 +190,12 @@ + # import x.y.z binds x in the current namespace + import test as x + import test.support +- self.assertTrue(x is test, x.__name__) ++ self.assertIs(x, test, x.__name__) + self.assertTrue(hasattr(test.support, "__file__")) + + # import x.y.z as w binds z as w + import test.support as y +- self.assertTrue(y is test.support, y.__name__) ++ self.assertIs(y, test.support, y.__name__) + + def test_failing_reload(self): + # A failing reload should leave the module object in sys.modules. +@@ -223,7 +223,7 @@ + self.assertRaises(ZeroDivisionError, importlib.reload, mod) + # But we still expect the module to be in sys.modules. 
+ mod = sys.modules.get(TESTFN) +- self.assertIsNot(mod, None, "expected module to be in sys.modules") ++ self.assertIsNotNone(mod, "expected module to be in sys.modules") + + # We should have replaced a w/ 10, but the old b value should + # stick. +diff -r c0e311e010fc Lib/test/test_importlib/__init__.py +--- a/Lib/test/test_importlib/__init__.py ++++ b/Lib/test/test_importlib/__init__.py +@@ -1,33 +1,5 @@ + import os +-import sys +-from test import support +-import unittest ++from test.support import load_package_tests + +-def test_suite(package=__package__, directory=os.path.dirname(__file__)): +- suite = unittest.TestSuite() +- for name in os.listdir(directory): +- if name.startswith(('.', '__')): +- continue +- path = os.path.join(directory, name) +- if (os.path.isfile(path) and name.startswith('test_') and +- name.endswith('.py')): +- submodule_name = os.path.splitext(name)[0] +- module_name = "{0}.{1}".format(package, submodule_name) +- __import__(module_name, level=0) +- module_tests = unittest.findTestCases(sys.modules[module_name]) +- suite.addTest(module_tests) +- elif os.path.isdir(path): +- package_name = "{0}.{1}".format(package, name) +- __import__(package_name, level=0) +- package_tests = getattr(sys.modules[package_name], 'test_suite')() +- suite.addTest(package_tests) +- else: +- continue +- return suite +- +- +-def test_main(): +- start_dir = os.path.dirname(__file__) +- top_dir = os.path.dirname(os.path.dirname(start_dir)) +- test_loader = unittest.TestLoader() +- support.run_unittest(test_loader.discover(start_dir, top_level_dir=top_dir)) ++def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) +diff -r c0e311e010fc Lib/test/test_importlib/__main__.py +--- a/Lib/test/test_importlib/__main__.py ++++ b/Lib/test/test_importlib/__main__.py +@@ -1,9 +1,4 @@ +-"""Run importlib's test suite. ++from . import load_tests ++import unittest + +-Specifying the ``--builtin`` flag will run tests, where applicable, with +-builtins.__import__ instead of importlib.__import__. +- +-""" +-if __name__ == '__main__': +- from . import test_main +- test_main() ++unittest.main() +diff -r c0e311e010fc Lib/test/test_importlib/builtin/__init__.py +--- a/Lib/test/test_importlib/builtin/__init__.py ++++ b/Lib/test/test_importlib/builtin/__init__.py +@@ -1,12 +1,5 @@ +-from .. import test_suite + import os ++from test.support import load_package_tests + +- +-def test_suite(): +- directory = os.path.dirname(__file__) +- return test_suite('importlib.test.builtin', directory) +- +- +-if __name__ == '__main__': +- from test.support import run_unittest +- run_unittest(test_suite()) ++def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) +diff -r c0e311e010fc Lib/test/test_importlib/builtin/__main__.py +--- /dev/null ++++ b/Lib/test/test_importlib/builtin/__main__.py +@@ -0,0 +1,4 @@ ++from . import load_tests ++import unittest ++ ++unittest.main() +diff -r c0e311e010fc Lib/test/test_importlib/builtin/test_loader.py +--- a/Lib/test/test_importlib/builtin/test_loader.py ++++ b/Lib/test/test_importlib/builtin/test_loader.py +@@ -87,7 +87,7 @@ + def test_is_package(self): + # Cannot be a package. + result = self.machinery.BuiltinImporter.is_package(builtin_util.NAME) +- self.assertTrue(not result) ++ self.assertFalse(result) + + def test_not_builtin(self): + # Modules not built-in should raise ImportError. 
+diff -r c0e311e010fc Lib/test/test_importlib/extension/__init__.py +--- a/Lib/test/test_importlib/extension/__init__.py ++++ b/Lib/test/test_importlib/extension/__init__.py +@@ -1,13 +1,5 @@ +-from .. import test_suite +-import os.path +-import unittest ++import os ++from test.support import load_package_tests + +- +-def test_suite(): +- directory = os.path.dirname(__file__) +- return test_suite('importlib.test.extension', directory) +- +- +-if __name__ == '__main__': +- from test.support import run_unittest +- run_unittest(test_suite()) ++def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) +diff -r c0e311e010fc Lib/test/test_importlib/extension/__main__.py +--- /dev/null ++++ b/Lib/test/test_importlib/extension/__main__.py +@@ -0,0 +1,4 @@ ++from . import load_tests ++import unittest ++ ++unittest.main() +diff -r c0e311e010fc Lib/test/test_importlib/frozen/__init__.py +--- a/Lib/test/test_importlib/frozen/__init__.py ++++ b/Lib/test/test_importlib/frozen/__init__.py +@@ -1,13 +1,5 @@ +-from .. import test_suite +-import os.path +-import unittest ++import os ++from test.support import load_package_tests + +- +-def test_suite(): +- directory = os.path.dirname(__file__) +- return test_suite('importlib.test.frozen', directory) +- +- +-if __name__ == '__main__': +- from test.support import run_unittest +- run_unittest(test_suite()) ++def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) +diff -r c0e311e010fc Lib/test/test_importlib/frozen/__main__.py +--- /dev/null ++++ b/Lib/test/test_importlib/frozen/__main__.py +@@ -0,0 +1,4 @@ ++from . import load_tests ++import unittest ++ ++unittest.main() +diff -r c0e311e010fc Lib/test/test_importlib/import_/__init__.py +--- a/Lib/test/test_importlib/import_/__init__.py ++++ b/Lib/test/test_importlib/import_/__init__.py +@@ -1,13 +1,5 @@ +-from .. import test_suite +-import os.path +-import unittest ++import os ++from test.support import load_package_tests + +- +-def test_suite(): +- directory = os.path.dirname(__file__) +- return test_suite('importlib.test.import_', directory) +- +- +-if __name__ == '__main__': +- from test.support import run_unittest +- run_unittest(test_suite()) ++def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) +diff -r c0e311e010fc Lib/test/test_importlib/import_/__main__.py +--- /dev/null ++++ b/Lib/test/test_importlib/import_/__main__.py +@@ -0,0 +1,4 @@ ++from . import load_tests ++import unittest ++ ++unittest.main() +diff -r c0e311e010fc Lib/test/test_importlib/import_/test_fromlist.py +--- a/Lib/test/test_importlib/import_/test_fromlist.py ++++ b/Lib/test/test_importlib/import_/test_fromlist.py +@@ -61,7 +61,7 @@ + with util.import_state(meta_path=[importer]): + module = self.__import__('module', fromlist=['non_existent']) + self.assertEqual(module.__name__, 'module') +- self.assertTrue(not hasattr(module, 'non_existent')) ++ self.assertFalse(hasattr(module, 'non_existent')) + + def test_module_from_package(self): + # [module] +diff -r c0e311e010fc Lib/test/test_importlib/import_/test_meta_path.py +--- a/Lib/test/test_importlib/import_/test_meta_path.py ++++ b/Lib/test/test_importlib/import_/test_meta_path.py +@@ -96,7 +96,7 @@ + args = log[1][0] + kwargs = log[1][1] + # Assuming all arguments are positional. 
+- self.assertTrue(not kwargs) ++ self.assertFalse(kwargs) + self.assertEqual(args[0], mod_name) + self.assertIs(args[1], path) + +diff -r c0e311e010fc Lib/test/test_importlib/source/__init__.py +--- a/Lib/test/test_importlib/source/__init__.py ++++ b/Lib/test/test_importlib/source/__init__.py +@@ -1,13 +1,5 @@ +-from .. import test_suite +-import os.path +-import unittest ++import os ++from test.support import load_package_tests + +- +-def test_suite(): +- directory = os.path.dirname(__file__) +- return test.test_suite('importlib.test.source', directory) +- +- +-if __name__ == '__main__': +- from test.support import run_unittest +- run_unittest(test_suite()) ++def load_tests(*args): ++ return load_package_tests(os.path.dirname(__file__), *args) +diff -r c0e311e010fc Lib/test/test_importlib/source/__main__.py +--- /dev/null ++++ b/Lib/test/test_importlib/source/__main__.py +@@ -0,0 +1,4 @@ ++from . import load_tests ++import unittest ++ ++unittest.main() +diff -r c0e311e010fc Lib/test/test_importlib/test_abc.py +--- a/Lib/test/test_importlib/test_abc.py ++++ b/Lib/test/test_importlib/test_abc.py +@@ -783,7 +783,7 @@ + warnings.simplefilter('ignore', DeprecationWarning) + module = self.loader.load_module(self.name) + self.verify_module(module) +- self.assertTrue(not hasattr(module, '__path__')) ++ self.assertFalse(hasattr(module, '__path__')) + + def test_get_source_encoding(self): + # Source is considered encoded in UTF-8 by default unless otherwise +diff -r c0e311e010fc Lib/test/test_inspect.py +--- a/Lib/test/test_inspect.py ++++ b/Lib/test/test_inspect.py +@@ -3048,6 +3048,13 @@ + self.assertEqual(lines[:-1], inspect.getsource(module).splitlines()) + self.assertEqual(err, b'') + ++ def test_custom_getattr(self): ++ def foo(): ++ pass ++ foo.__signature__ = 42 ++ with self.assertRaises(TypeError): ++ inspect.signature(foo) ++ + @unittest.skipIf(ThreadPoolExecutor is None, + 'threads required to test __qualname__ for source files') + def test_qualname_source(self): +diff -r c0e311e010fc Lib/test/test_io.py +--- a/Lib/test/test_io.py ++++ b/Lib/test/test_io.py +@@ -653,6 +653,20 @@ + fileio.close() + f2.readline() + ++ def test_nonbuffered_textio(self): ++ with warnings.catch_warnings(record=True) as recorded: ++ with self.assertRaises(ValueError): ++ self.open(support.TESTFN, 'w', buffering=0) ++ support.gc_collect() ++ self.assertEqual(recorded, []) ++ ++ def test_invalid_newline(self): ++ with warnings.catch_warnings(record=True) as recorded: ++ with self.assertRaises(ValueError): ++ self.open(support.TESTFN, 'w', newline='invalid') ++ support.gc_collect() ++ self.assertEqual(recorded, []) ++ + + class CIOTest(IOTest): + +@@ -792,9 +806,27 @@ + with self.assertRaises(OSError) as err: # exception not swallowed + b.close() + self.assertEqual(err.exception.args, ('close',)) ++ self.assertIsInstance(err.exception.__context__, OSError) + self.assertEqual(err.exception.__context__.args, ('flush',)) + self.assertFalse(b.closed) + ++ def test_nonnormalized_close_error_on_close(self): ++ # Issue #21677 ++ raw = self.MockRawIO() ++ def bad_flush(): ++ raise non_existing_flush ++ def bad_close(): ++ raise non_existing_close ++ raw.close = bad_close ++ b = self.tp(raw) ++ b.flush = bad_flush ++ with self.assertRaises(NameError) as err: # exception not swallowed ++ b.close() ++ self.assertIn('non_existing_close', str(err.exception)) ++ self.assertIsInstance(err.exception.__context__, NameError) ++ self.assertIn('non_existing_flush', str(err.exception.__context__)) ++ self.assertFalse(b.closed) ++ + 
def test_multi_close(self): + raw = self.MockRawIO() + b = self.tp(raw) +@@ -2576,6 +2608,39 @@ + self.assertRaises(OSError, txt.close) # exception not swallowed + self.assertTrue(txt.closed) + ++ def test_close_error_on_close(self): ++ buffer = self.BytesIO(self.testdata) ++ def bad_flush(): ++ raise OSError('flush') ++ def bad_close(): ++ raise OSError('close') ++ buffer.close = bad_close ++ txt = self.TextIOWrapper(buffer, encoding="ascii") ++ txt.flush = bad_flush ++ with self.assertRaises(OSError) as err: # exception not swallowed ++ txt.close() ++ self.assertEqual(err.exception.args, ('close',)) ++ self.assertIsInstance(err.exception.__context__, OSError) ++ self.assertEqual(err.exception.__context__.args, ('flush',)) ++ self.assertFalse(txt.closed) ++ ++ def test_nonnormalized_close_error_on_close(self): ++ # Issue #21677 ++ buffer = self.BytesIO(self.testdata) ++ def bad_flush(): ++ raise non_existing_flush ++ def bad_close(): ++ raise non_existing_close ++ buffer.close = bad_close ++ txt = self.TextIOWrapper(buffer, encoding="ascii") ++ txt.flush = bad_flush ++ with self.assertRaises(NameError) as err: # exception not swallowed ++ txt.close() ++ self.assertIn('non_existing_close', str(err.exception)) ++ self.assertIsInstance(err.exception.__context__, NameError) ++ self.assertIn('non_existing_flush', str(err.exception.__context__)) ++ self.assertFalse(txt.closed) ++ + def test_multi_close(self): + txt = self.TextIOWrapper(self.BytesIO(self.testdata), encoding="ascii") + txt.close() +diff -r c0e311e010fc Lib/test/test_itertools.py +--- a/Lib/test/test_itertools.py ++++ b/Lib/test/test_itertools.py +@@ -967,6 +967,12 @@ + self.assertEqual(take(2, copy.deepcopy(c)), list('a' * 2)) + self.pickletest(repeat(object='a', times=10)) + ++ def test_repeat_with_negative_times(self): ++ self.assertEqual(repr(repeat('a', -1)), "repeat('a', 0)") ++ self.assertEqual(repr(repeat('a', -2)), "repeat('a', 0)") ++ self.assertEqual(repr(repeat('a', times=-1)), "repeat('a', 0)") ++ self.assertEqual(repr(repeat('a', times=-2)), "repeat('a', 0)") ++ + def test_map(self): + self.assertEqual(list(map(operator.pow, range(3), range(1,7))), + [0**1, 1**2, 2**3]) +@@ -1741,8 +1747,15 @@ + + def test_repeat(self): + self.assertEqual(operator.length_hint(repeat(None, 50)), 50) ++ self.assertEqual(operator.length_hint(repeat(None, 0)), 0) + self.assertEqual(operator.length_hint(repeat(None), 12), 12) + ++ def test_repeat_with_negative_times(self): ++ self.assertEqual(operator.length_hint(repeat(None, -1)), 0) ++ self.assertEqual(operator.length_hint(repeat(None, -2)), 0) ++ self.assertEqual(operator.length_hint(repeat(None, times=-1)), 0) ++ self.assertEqual(operator.length_hint(repeat(None, times=-2)), 0) ++ + class RegressionTests(unittest.TestCase): + + def test_sf_793826(self): +diff -r c0e311e010fc Lib/test/test_json/__init__.py +--- a/Lib/test/test_json/__init__.py ++++ b/Lib/test/test_json/__init__.py +@@ -42,23 +42,12 @@ + '_json') + + +-here = os.path.dirname(__file__) +- +-def load_tests(*args): +- suite = additional_tests() +- loader = unittest.TestLoader() +- for fn in os.listdir(here): +- if fn.startswith("test") and fn.endswith(".py"): +- modname = "test.test_json." 
+ fn[:-3] +- __import__(modname) +- module = sys.modules[modname] +- suite.addTests(loader.loadTestsFromModule(module)) +- return suite +- +-def additional_tests(): ++def load_tests(loader, _, pattern): + suite = unittest.TestSuite() + for mod in (json, json.encoder, json.decoder): + suite.addTest(doctest.DocTestSuite(mod)) + suite.addTest(TestPyTest('test_pyjson')) + suite.addTest(TestCTest('test_cjson')) +- return suite ++ ++ pkg_dir = os.path.dirname(__file__) ++ return support.load_package_tests(pkg_dir, loader, suite, pattern) +diff -r c0e311e010fc Lib/test/test_logging.py +--- a/Lib/test/test_logging.py ++++ b/Lib/test/test_logging.py +@@ -865,9 +865,6 @@ + super(TestTCPServer, self).server_bind() + self.port = self.socket.getsockname()[1] + +- class TestUnixStreamServer(TestTCPServer): +- address_family = socket.AF_UNIX +- + class TestUDPServer(ControlMixin, ThreadingUDPServer): + """ + A UDP server which is controllable using :class:`ControlMixin`. +@@ -915,8 +912,12 @@ + super(TestUDPServer, self).server_close() + self._closed = True + +- class TestUnixDatagramServer(TestUDPServer): +- address_family = socket.AF_UNIX ++ if hasattr(socket, "AF_UNIX"): ++ class TestUnixStreamServer(TestTCPServer): ++ address_family = socket.AF_UNIX ++ ++ class TestUnixDatagramServer(TestUDPServer): ++ address_family = socket.AF_UNIX + + # - end of server_helper section + +@@ -1457,12 +1458,13 @@ + os.remove(fn) + return fn + ++@unittest.skipUnless(hasattr(socket, "AF_UNIX"), "Unix sockets required") + @unittest.skipUnless(threading, 'Threading required for this test.') + class UnixSocketHandlerTest(SocketHandlerTest): + + """Test for SocketHandler with unix sockets.""" + +- if threading: ++ if threading and hasattr(socket, "AF_UNIX"): + server_class = TestUnixStreamServer + + def setUp(self): +@@ -1528,13 +1530,13 @@ + self.handled.wait() + self.assertEqual(self.log_output, "spam\neggs\n") + +- ++@unittest.skipUnless(hasattr(socket, "AF_UNIX"), "Unix sockets required") + @unittest.skipUnless(threading, 'Threading required for this test.') + class UnixDatagramHandlerTest(DatagramHandlerTest): + + """Test for DatagramHandler using Unix sockets.""" + +- if threading: ++ if threading and hasattr(socket, "AF_UNIX"): + server_class = TestUnixDatagramServer + + def setUp(self): +@@ -1603,13 +1605,13 @@ + self.handled.wait() + self.assertEqual(self.log_output, b'<11>h\xc3\xa4m-sp\xc3\xa4m') + +- ++@unittest.skipUnless(hasattr(socket, "AF_UNIX"), "Unix sockets required") + @unittest.skipUnless(threading, 'Threading required for this test.') + class UnixSysLogHandlerTest(SysLogHandlerTest): + + """Test for SysLogHandler with Unix sockets.""" + +- if threading: ++ if threading and hasattr(socket, "AF_UNIX"): + server_class = TestUnixDatagramServer + + def setUp(self): +diff -r c0e311e010fc Lib/test/test_minidom.py +--- a/Lib/test/test_minidom.py ++++ b/Lib/test/test_minidom.py +@@ -1531,6 +1531,13 @@ + num_children_after = len(doc.childNodes) + self.assertTrue(num_children_after == num_children_before - 1) + ++ def testProcessingInstructionNameError(self): ++ # wrong variable in .nodeValue property will ++ # lead to "NameError: name 'data' is not defined" ++ doc = parse(tstfile) ++ pi = doc.createProcessingInstruction("y", "z") ++ pi.nodeValue = "crash" ++ + def test_main(): + run_unittest(MinidomTest) + +diff -r c0e311e010fc Lib/test/test_modulefinder.py +--- a/Lib/test/test_modulefinder.py ++++ b/Lib/test/test_modulefinder.py +@@ -245,11 +245,12 @@ + + + class ModuleFinderTest(unittest.TestCase): +- def 
_do_test(self, info, report=False): ++ def _do_test(self, info, report=False, debug=0, replace_paths=[]): + import_this, modules, missing, maybe_missing, source = info + create_package(source) + try: +- mf = modulefinder.ModuleFinder(path=TEST_PATH) ++ mf = modulefinder.ModuleFinder(path=TEST_PATH, debug=debug, ++ replace_paths=replace_paths) + mf.import_hook(import_this) + if report: + mf.report() +@@ -308,9 +309,16 @@ + os.remove(source_path) + self._do_test(bytecode_test) + ++ def test_replace_paths(self): ++ old_path = os.path.join(TEST_DIR, 'a', 'module.py') ++ new_path = os.path.join(TEST_DIR, 'a', 'spam.py') ++ with support.captured_stdout() as output: ++ self._do_test(maybe_test, debug=2, ++ replace_paths=[(old_path, new_path)]) ++ output = output.getvalue() ++ expected = "co_filename %r changed to %r" % (old_path, new_path) ++ self.assertIn(expected, output) + +-def test_main(): +- support.run_unittest(ModuleFinderTest) + + if __name__ == "__main__": + unittest.main() +diff -r c0e311e010fc Lib/test/test_multiprocessing_main_handling.py +--- a/Lib/test/test_multiprocessing_main_handling.py ++++ b/Lib/test/test_multiprocessing_main_handling.py +@@ -1,4 +1,8 @@ + # tests __main__ module handling in multiprocessing ++from test import support ++# Skip tests if _thread or _multiprocessing wasn't built. ++support.import_module('_thread') ++support.import_module('_multiprocessing') + + import importlib + import importlib.machinery +@@ -9,14 +13,11 @@ + import os.path + import py_compile + +-from test import support + from test.script_helper import ( + make_pkg, make_script, make_zip_pkg, make_zip_script, + assert_python_ok, assert_python_failure, temp_dir, + spawn_python, kill_python) + +-# Skip tests if _multiprocessing wasn't built. +-_multiprocessing = support.import_module('_multiprocessing') + # Look up which start methods are available to test + import multiprocessing + AVAILABLE_START_METHODS = set(multiprocessing.get_all_start_methods()) +diff -r c0e311e010fc Lib/test/test_ntpath.py +--- a/Lib/test/test_ntpath.py ++++ b/Lib/test/test_ntpath.py +@@ -258,6 +258,41 @@ + check('%spam%bar', '%sbar' % nonascii) + check('%{}%bar'.format(nonascii), 'ham%sbar' % nonascii) + ++ def test_expanduser(self): ++ tester('ntpath.expanduser("test")', 'test') ++ ++ with support.EnvironmentVarGuard() as env: ++ env.clear() ++ tester('ntpath.expanduser("~test")', '~test') ++ ++ env['HOMEPATH'] = 'eric\\idle' ++ env['HOMEDRIVE'] = 'C:\\' ++ tester('ntpath.expanduser("~test")', 'C:\\eric\\test') ++ tester('ntpath.expanduser("~")', 'C:\\eric\\idle') ++ ++ del env['HOMEDRIVE'] ++ tester('ntpath.expanduser("~test")', 'eric\\test') ++ tester('ntpath.expanduser("~")', 'eric\\idle') ++ ++ env.clear() ++ env['USERPROFILE'] = 'C:\\eric\\idle' ++ tester('ntpath.expanduser("~test")', 'C:\\eric\\test') ++ tester('ntpath.expanduser("~")', 'C:\\eric\\idle') ++ ++ env.clear() ++ env['HOME'] = 'C:\\idle\\eric' ++ tester('ntpath.expanduser("~test")', 'C:\\idle\\test') ++ tester('ntpath.expanduser("~")', 'C:\\idle\\eric') ++ ++ tester('ntpath.expanduser("~test\\foo\\bar")', ++ 'C:\\idle\\test\\foo\\bar') ++ tester('ntpath.expanduser("~test/foo/bar")', ++ 'C:\\idle\\test/foo/bar') ++ tester('ntpath.expanduser("~\\foo\\bar")', ++ 'C:\\idle\\eric\\foo\\bar') ++ tester('ntpath.expanduser("~/foo/bar")', ++ 'C:\\idle\\eric/foo/bar') ++ + def test_abspath(self): + # ntpath.abspath() can only be used on a system with the "nt" module + # (reasonably), so we protect this test with "import nt". 
This allows +diff -r c0e311e010fc Lib/test/test_pathlib.py +--- a/Lib/test/test_pathlib.py ++++ b/Lib/test/test_pathlib.py +@@ -540,6 +540,10 @@ + self.assertRaises(ValueError, P('').with_name, 'd.xml') + self.assertRaises(ValueError, P('.').with_name, 'd.xml') + self.assertRaises(ValueError, P('/').with_name, 'd.xml') ++ self.assertRaises(ValueError, P('a/b').with_name, '') ++ self.assertRaises(ValueError, P('a/b').with_name, '/c') ++ self.assertRaises(ValueError, P('a/b').with_name, 'c/') ++ self.assertRaises(ValueError, P('a/b').with_name, 'c/d') + + def test_with_suffix_common(self): + P = self.cls +@@ -547,6 +551,9 @@ + self.assertEqual(P('/a/b').with_suffix('.gz'), P('/a/b.gz')) + self.assertEqual(P('a/b.py').with_suffix('.gz'), P('a/b.gz')) + self.assertEqual(P('/a/b.py').with_suffix('.gz'), P('/a/b.gz')) ++ # Stripping suffix ++ self.assertEqual(P('a/b.py').with_suffix(''), P('a/b')) ++ self.assertEqual(P('/a/b').with_suffix(''), P('/a/b')) + # Path doesn't have a "filename" component + self.assertRaises(ValueError, P('').with_suffix, '.gz') + self.assertRaises(ValueError, P('.').with_suffix, '.gz') +@@ -554,9 +561,12 @@ + # Invalid suffix + self.assertRaises(ValueError, P('a/b').with_suffix, 'gz') + self.assertRaises(ValueError, P('a/b').with_suffix, '/') ++ self.assertRaises(ValueError, P('a/b').with_suffix, '.') + self.assertRaises(ValueError, P('a/b').with_suffix, '/.gz') + self.assertRaises(ValueError, P('a/b').with_suffix, 'c/d') + self.assertRaises(ValueError, P('a/b').with_suffix, '.c/.d') ++ self.assertRaises(ValueError, P('a/b').with_suffix, './.d') ++ self.assertRaises(ValueError, P('a/b').with_suffix, '.d/.') + + def test_relative_to_common(self): + P = self.cls +@@ -950,6 +960,10 @@ + self.assertRaises(ValueError, P('c:').with_name, 'd.xml') + self.assertRaises(ValueError, P('c:/').with_name, 'd.xml') + self.assertRaises(ValueError, P('//My/Share').with_name, 'd.xml') ++ self.assertRaises(ValueError, P('c:a/b').with_name, 'd:') ++ self.assertRaises(ValueError, P('c:a/b').with_name, 'd:e') ++ self.assertRaises(ValueError, P('c:a/b').with_name, 'd:/e') ++ self.assertRaises(ValueError, P('c:a/b').with_name, '//My/Share') + + def test_with_suffix(self): + P = self.cls +@@ -1200,7 +1214,7 @@ + + def setUp(self): + os.mkdir(BASE) +- self.addCleanup(shutil.rmtree, BASE) ++ self.addCleanup(support.rmtree, BASE) + os.mkdir(join('dirA')) + os.mkdir(join('dirB')) + os.mkdir(join('dirC')) +@@ -1385,7 +1399,7 @@ + self._check_resolve_relative(p, P(BASE, 'dirB', 'fileB')) + # Now create absolute symlinks + d = tempfile.mkdtemp(suffix='-dirD') +- self.addCleanup(shutil.rmtree, d) ++ self.addCleanup(support.rmtree, d) + os.symlink(os.path.join(d), join('dirA', 'linkX')) + os.symlink(join('dirB'), os.path.join(d, 'linkY')) + p = P(BASE, 'dirA', 'linkX', 'linkY', 'fileB') +diff -r c0e311e010fc Lib/test/test_pkgutil.py +--- a/Lib/test/test_pkgutil.py ++++ b/Lib/test/test_pkgutil.py +@@ -363,6 +363,20 @@ + loader = pkgutil.get_loader(name) + self.assertIsNone(loader) + ++ def test_get_loader_None_in_sys_modules(self): ++ name = 'totally bogus' ++ sys.modules[name] = None ++ try: ++ loader = pkgutil.get_loader(name) ++ finally: ++ del sys.modules[name] ++ self.assertIsNone(loader) ++ ++ def test_find_loader_missing_module(self): ++ name = 'totally bogus' ++ loader = pkgutil.find_loader(name) ++ self.assertIsNone(loader) ++ + def test_find_loader_avoids_emulation(self): + with check_warnings() as w: + self.assertIsNotNone(pkgutil.find_loader("sys")) +diff -r c0e311e010fc 
Lib/test/test_plistlib.py +--- a/Lib/test/test_plistlib.py ++++ b/Lib/test/test_plistlib.py +@@ -207,6 +207,9 @@ + for fmt in ALL_FORMATS: + with self.subTest(fmt=fmt): + pl = self._create(fmt=fmt) ++ pl2 = plistlib.loads(TESTDATA[fmt], fmt=fmt) ++ self.assertEqual(dict(pl), dict(pl2), ++ "generated data was not identical to Apple's output") + pl2 = plistlib.loads(TESTDATA[fmt]) + self.assertEqual(dict(pl), dict(pl2), + "generated data was not identical to Apple's output") +@@ -217,6 +220,8 @@ + b = BytesIO() + pl = self._create(fmt=fmt) + plistlib.dump(pl, b, fmt=fmt) ++ pl2 = plistlib.load(BytesIO(b.getvalue()), fmt=fmt) ++ self.assertEqual(dict(pl), dict(pl2)) + pl2 = plistlib.load(BytesIO(b.getvalue())) + self.assertEqual(dict(pl), dict(pl2)) + +@@ -411,6 +416,18 @@ + pl2 = plistlib.loads(data) + self.assertEqual(dict(pl), dict(pl2)) + ++ def test_nonstandard_refs_size(self): ++ # Issue #21538: Refs and offsets are 24-bit integers ++ data = (b'bplist00' ++ b'\xd1\x00\x00\x01\x00\x00\x02QaQb' ++ b'\x00\x00\x08\x00\x00\x0f\x00\x00\x11' ++ b'\x00\x00\x00\x00\x00\x00' ++ b'\x03\x03' ++ b'\x00\x00\x00\x00\x00\x00\x00\x03' ++ b'\x00\x00\x00\x00\x00\x00\x00\x00' ++ b'\x00\x00\x00\x00\x00\x00\x00\x13') ++ self.assertEqual(plistlib.loads(data), {'a': 'b'}) ++ + + class TestPlistlibDeprecated(unittest.TestCase): + def test_io_deprecated(self): +diff -r c0e311e010fc Lib/test/test_posix.py +--- a/Lib/test/test_posix.py ++++ b/Lib/test/test_posix.py +@@ -757,7 +757,7 @@ + + @unittest.skipUnless(hasattr(os, 'getegid'), "test needs os.getegid()") + def test_getgroups(self): +- with os.popen('id -G') as idg: ++ with os.popen('id -G 2>/dev/null') as idg: + groups = idg.read().strip() + ret = idg.close() + +@@ -768,7 +768,7 @@ + if sys.platform == 'darwin': + import sysconfig + dt = sysconfig.get_config_var('MACOSX_DEPLOYMENT_TARGET') or '10.0' +- if float(dt) < 10.6: ++ if tuple(int(n) for n in dt.split('.')[0:2]) < (10, 6): + raise unittest.SkipTest("getgroups(2) is broken prior to 10.6") + + # 'id -G' and 'os.getgroups()' should return the same +diff -r c0e311e010fc Lib/test/test_pydoc.py +--- a/Lib/test/test_pydoc.py ++++ b/Lib/test/test_pydoc.py +@@ -14,6 +14,7 @@ + import time + import types + import unittest ++import urllib.parse + import xml.etree + import textwrap + from io import StringIO +@@ -47,6 +48,7 @@ + builtins.object + A + B ++ C + \x20\x20\x20\x20 + class A(builtins.object) + | Hello and goodbye +@@ -74,6 +76,26 @@ + | Data and other attributes defined here: + |\x20\x20 + | NO_MEANING = 'eggs' ++\x20\x20\x20\x20 ++ class C(builtins.object) ++ | Methods defined here: ++ |\x20\x20 ++ | get_answer(self) ++ | Return say_no() ++ |\x20\x20 ++ | is_it_true(self) ++ | Return self.get_answer() ++ |\x20\x20 ++ | say_no(self) ++ |\x20\x20 ++ | ---------------------------------------------------------------------- ++ | Data descriptors defined here: ++ |\x20\x20 ++ | __dict__ ++ | dictionary for instance variables (if defined) ++ |\x20\x20 ++ | __weakref__ ++ | list of weak references to the object (if defined) + + FUNCTIONS + doc_func() +@@ -124,6 +146,7 @@ +
[HTML markup lost in extraction: this hunk and the following @@ -165,6 +188,28 @@ hunk
 add the index entry for class C and a "class C(builtins.object)" section (methods
 get_answer(self), is_it_true(self), say_no(self), plus the __dict__ and __weakref__
 data descriptors) to expected_html_pattern, mirroring the plain-text additions above.]
+ + +@@ -358,14 +403,11 @@ + "Docstrings are omitted with -O2 and above") + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __locals__ unexpectedly') ++ @requires_docstrings + def test_html_doc(self): + result, doc_loc = get_pydoc_html(pydoc_mod) + mod_file = inspect.getabsfile(pydoc_mod) +- if sys.platform == 'win32': +- import nturl2path +- mod_url = nturl2path.pathname2url(mod_file) +- else: +- mod_url = mod_file ++ mod_url = urllib.parse.quote(mod_file) + expected_html = expected_html_pattern % ( + (mod_url, mod_file, doc_loc) + + expected_html_data_docstrings) +@@ -377,6 +419,7 @@ + "Docstrings are omitted with -O2 and above") + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __locals__ unexpectedly') ++ @requires_docstrings + def test_text_doc(self): + result, doc_loc = get_pydoc_text(pydoc_mod) + expected_text = expected_text_pattern % ( +@@ -402,6 +445,14 @@ + result, doc_loc = get_pydoc_text(xml.etree) + self.assertEqual(doc_loc, "", "MODULE DOCS incorrectly includes a link") + ++ def test_getpager_with_stdin_none(self): ++ previous_stdin = sys.stdin ++ try: ++ sys.stdin = None ++ pydoc.getpager() # Shouldn't fail. ++ finally: ++ sys.stdin = previous_stdin ++ + def test_non_str_name(self): + # issue14638 + # Treat illegal (non-str) name like no name +@@ -443,6 +494,7 @@ + 'Docstrings are omitted with -O2 and above') + @unittest.skipIf(hasattr(sys, 'gettrace') and sys.gettrace(), + 'trace function introduces __locals__ unexpectedly') ++ @requires_docstrings + def test_help_output_redirect(self): + # issue 940286, if output is set in Helper, then all output from + # Helper.help should be redirected +@@ -694,7 +746,7 @@ + try: + pydoc.render_doc(name) + except ImportError: +- self.fail('finding the doc of {!r} failed'.format(o)) ++ self.fail('finding the doc of {!r} failed'.format(name)) + + for name in ('notbuiltins', 'strrr', 'strr.translate', + 'str.trrrranslate', 'builtins.strrr', +diff -r c0e311e010fc Lib/test/test_random.py +--- a/Lib/test/test_random.py ++++ b/Lib/test/test_random.py +@@ -602,7 +602,7 @@ + for variate, args, expected in [ + (g.uniform, (10.0, 10.0), 10.0), + (g.triangular, (10.0, 10.0), 10.0), +- #(g.triangular, (10.0, 10.0, 10.0), 10.0), ++ (g.triangular, (10.0, 10.0, 10.0), 10.0), + (g.expovariate, (float('inf'),), 0.0), + (g.vonmisesvariate, (3.0, float('inf')), 3.0), + (g.gauss, (10.0, 0.0), 10.0), +diff -r c0e311e010fc Lib/test/test_readline.py +--- a/Lib/test/test_readline.py ++++ b/Lib/test/test_readline.py +@@ -1,17 +1,20 @@ + """ + Very minimal unittests for parts of the readline module. +- +-These tests were added to check that the libedit emulation on OSX and +-the "real" readline have the same interface for history manipulation. That's +-why the tests cover only a small subset of the interface. + """ ++import os + import unittest + from test.support import run_unittest, import_module ++from test.script_helper import assert_python_ok + + # Skip tests if there is no readline module + readline = import_module('readline') + + class TestHistoryManipulation (unittest.TestCase): ++ """ ++ These tests were added to check that the libedit emulation on OSX and the ++ "real" readline have the same interface for history manipulation. That's ++ why the tests cover only a small subset of the interface. 
++ """ + + @unittest.skipIf(not hasattr(readline, 'clear_history'), + "The history update test cannot be run because the " +@@ -40,8 +43,18 @@ + self.assertEqual(readline.get_current_history_length(), 1) + + ++class TestReadline(unittest.TestCase): ++ def test_init(self): ++ # Issue #19884: Ensure that the ANSI sequence "\033[1034h" is not ++ # written into stdout when the readline module is imported and stdout ++ # is redirected to a pipe. ++ rc, stdout, stderr = assert_python_ok('-c', 'import readline', ++ TERM='xterm-256color') ++ self.assertEqual(stdout, b'') ++ ++ + def test_main(): +- run_unittest(TestHistoryManipulation) ++ run_unittest(TestHistoryManipulation, TestReadline) + + if __name__ == "__main__": + test_main() +diff -r c0e311e010fc Lib/test/test_robotparser.py +--- a/Lib/test/test_robotparser.py ++++ b/Lib/test/test_robotparser.py +@@ -4,6 +4,12 @@ + from urllib.error import URLError, HTTPError + from urllib.request import urlopen + from test import support ++from http.server import BaseHTTPRequestHandler, HTTPServer ++try: ++ import threading ++except ImportError: ++ threading = None ++ + + class RobotTestCase(unittest.TestCase): + def __init__(self, index=None, parser=None, url=None, good=None, agent=None): +@@ -247,33 +253,52 @@ + RobotTest(16, doc, good, bad) + + +-class NetworkTestCase(unittest.TestCase): ++class RobotHandler(BaseHTTPRequestHandler): ++ ++ def do_GET(self): ++ self.send_error(403, "Forbidden access") ++ ++ def log_message(self, format, *args): ++ pass ++ ++ ++@unittest.skipUnless(threading, 'threading required for this test') ++class PasswordProtectedSiteTestCase(unittest.TestCase): ++ ++ def setUp(self): ++ self.server = HTTPServer((support.HOST, 0), RobotHandler) ++ ++ self.t = threading.Thread( ++ name='HTTPServer serving', ++ target=self.server.serve_forever, ++ # Short poll interval to make the test finish quickly. ++ # Time between requests is short enough that we won't wake ++ # up spuriously too many times. ++ kwargs={'poll_interval':0.01}) ++ self.t.daemon = True # In case this function raises. ++ self.t.start() ++ ++ def tearDown(self): ++ self.server.shutdown() ++ self.t.join() ++ self.server.server_close() ++ ++ def runTest(self): ++ self.testPasswordProtectedSite() + + def testPasswordProtectedSite(self): +- support.requires('network') +- with support.transient_internet('mueblesmoraleda.com'): +- url = 'http://mueblesmoraleda.com' +- robots_url = url + "/robots.txt" +- # First check the URL is usable for our purposes, since the +- # test site is a bit flaky. 
+- try: +- urlopen(robots_url) +- except HTTPError as e: +- if e.code not in {401, 403}: +- self.skipTest( +- "%r should return a 401 or 403 HTTP error, not %r" +- % (robots_url, e.code)) +- else: +- self.skipTest( +- "%r should return a 401 or 403 HTTP error, not succeed" +- % (robots_url)) +- parser = urllib.robotparser.RobotFileParser() +- parser.set_url(url) +- try: +- parser.read() +- except URLError: +- self.skipTest('%s is unavailable' % url) +- self.assertEqual(parser.can_fetch("*", robots_url), False) ++ addr = self.server.server_address ++ url = 'http://' + support.HOST + ':' + str(addr[1]) ++ robots_url = url + "/robots.txt" ++ parser = urllib.robotparser.RobotFileParser() ++ parser.set_url(url) ++ parser.read() ++ self.assertFalse(parser.can_fetch("*", robots_url)) ++ ++ def __str__(self): ++ return '%s' % self.__class__.__name__ ++ ++class NetworkTestCase(unittest.TestCase): + + @unittest.skip('does not handle the gzip encoding delivered by pydotorg') + def testPythonOrg(self): +@@ -288,8 +313,8 @@ + def load_tests(loader, suite, pattern): + suite = unittest.makeSuite(NetworkTestCase) + suite.addTest(tests) ++ suite.addTest(PasswordProtectedSiteTestCase()) + return suite + + if __name__=='__main__': +- support.use_resources = ['network'] + unittest.main() +diff -r c0e311e010fc Lib/test/test_selectors.py +--- a/Lib/test/test_selectors.py ++++ b/Lib/test/test_selectors.py +@@ -378,7 +378,7 @@ + resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard)) + self.addCleanup(resource.setrlimit, resource.RLIMIT_NOFILE, + (soft, hard)) +- NUM_FDS = hard ++ NUM_FDS = min(hard, 2**16) + except (OSError, ValueError): + NUM_FDS = soft + +diff -r c0e311e010fc Lib/test/test_socket.py +--- a/Lib/test/test_socket.py ++++ b/Lib/test/test_socket.py +@@ -3,6 +3,7 @@ + + import errno + import io ++import itertools + import socket + import select + import tempfile +@@ -1145,17 +1146,24 @@ + sock.close() + + def test_getsockaddrarg(self): +- host = '0.0.0.0' ++ sock = socket.socket() ++ self.addCleanup(sock.close) + port = support.find_unused_port() + big_port = port + 65536 + neg_port = port - 65536 +- sock = socket.socket() +- try: +- self.assertRaises(OverflowError, sock.bind, (host, big_port)) +- self.assertRaises(OverflowError, sock.bind, (host, neg_port)) +- sock.bind((host, port)) +- finally: +- sock.close() ++ self.assertRaises(OverflowError, sock.bind, (HOST, big_port)) ++ self.assertRaises(OverflowError, sock.bind, (HOST, neg_port)) ++ # Since find_unused_port() is inherently subject to race conditions, we ++ # call it a couple times if necessary. 
++ for i in itertools.count(): ++ port = support.find_unused_port() ++ try: ++ sock.bind((HOST, port)) ++ except OSError as e: ++ if e.errno != errno.EADDRINUSE or i == 5: ++ raise ++ else: ++ break + + @unittest.skipUnless(os.name == "nt", "Windows specific") + def test_sock_ioctl(self): +@@ -1456,6 +1464,7 @@ + + + @unittest.skipUnless(HAVE_SOCKET_CAN, 'SocketCan required for this test.') ++@unittest.skipUnless(thread, 'Threading required for this test.') + class CANTest(ThreadedCANSocketTest): + + def __init__(self, methodName='runTest'): +diff -r c0e311e010fc Lib/test/test_spwd.py +--- /dev/null ++++ b/Lib/test/test_spwd.py +@@ -0,0 +1,60 @@ ++import os ++import unittest ++from test import support ++ ++spwd = support.import_module('spwd') ++ ++ ++@unittest.skipUnless(hasattr(os, 'geteuid') and os.geteuid() == 0, ++ 'root privileges required') ++class TestSpwdRoot(unittest.TestCase): ++ ++ def test_getspall(self): ++ entries = spwd.getspall() ++ self.assertIsInstance(entries, list) ++ for entry in entries: ++ self.assertIsInstance(entry, spwd.struct_spwd) ++ ++ def test_getspnam(self): ++ entries = spwd.getspall() ++ if not entries: ++ self.skipTest('empty shadow password database') ++ random_name = entries[0].sp_namp ++ entry = spwd.getspnam(random_name) ++ self.assertIsInstance(entry, spwd.struct_spwd) ++ self.assertEqual(entry.sp_namp, random_name) ++ self.assertEqual(entry.sp_namp, entry[0]) ++ self.assertEqual(entry.sp_namp, entry.sp_nam) ++ self.assertIsInstance(entry.sp_pwdp, str) ++ self.assertEqual(entry.sp_pwdp, entry[1]) ++ self.assertEqual(entry.sp_pwdp, entry.sp_pwd) ++ self.assertIsInstance(entry.sp_lstchg, int) ++ self.assertEqual(entry.sp_lstchg, entry[2]) ++ self.assertIsInstance(entry.sp_min, int) ++ self.assertEqual(entry.sp_min, entry[3]) ++ self.assertIsInstance(entry.sp_max, int) ++ self.assertEqual(entry.sp_max, entry[4]) ++ self.assertIsInstance(entry.sp_warn, int) ++ self.assertEqual(entry.sp_warn, entry[5]) ++ self.assertIsInstance(entry.sp_inact, int) ++ self.assertEqual(entry.sp_inact, entry[6]) ++ self.assertIsInstance(entry.sp_expire, int) ++ self.assertEqual(entry.sp_expire, entry[7]) ++ self.assertIsInstance(entry.sp_flag, int) ++ self.assertEqual(entry.sp_flag, entry[8]) ++ with self.assertRaises(KeyError) as cx: ++ spwd.getspnam('invalid user name') ++ self.assertEqual(str(cx.exception), "'getspnam(): name not found'") ++ self.assertRaises(TypeError, spwd.getspnam) ++ self.assertRaises(TypeError, spwd.getspnam, 0) ++ self.assertRaises(TypeError, spwd.getspnam, random_name, 0) ++ try: ++ bytes_name = os.fsencode(random_name) ++ except UnicodeEncodeError: ++ pass ++ else: ++ self.assertRaises(TypeError, spwd.getspnam, bytes_name) ++ ++ ++if __name__ == "__main__": ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_ssl.py +--- a/Lib/test/test_ssl.py ++++ b/Lib/test/test_ssl.py +@@ -281,11 +281,11 @@ + # Some sanity checks follow + # >= 0.9 + self.assertGreaterEqual(n, 0x900000) +- # < 2.0 +- self.assertLess(n, 0x20000000) ++ # < 3.0 ++ self.assertLess(n, 0x30000000) + major, minor, fix, patch, status = t + self.assertGreaterEqual(major, 0) +- self.assertLess(major, 2) ++ self.assertLess(major, 3) + self.assertGreaterEqual(minor, 0) + self.assertLess(minor, 256) + self.assertGreaterEqual(fix, 0) +@@ -294,9 +294,13 @@ + self.assertLessEqual(patch, 26) + self.assertGreaterEqual(status, 0) + self.assertLessEqual(status, 15) +- # Version string as returned by OpenSSL, the format might change +- self.assertTrue(s.startswith("OpenSSL 
{:d}.{:d}.{:d}".format(major, minor, fix)), +- (s, t)) ++ # Version string as returned by {Open,Libre}SSL, the format might change ++ if "LibreSSL" in s: ++ self.assertTrue(s.startswith("LibreSSL {:d}.{:d}".format(major, minor)), ++ (s, t)) ++ else: ++ self.assertTrue(s.startswith("OpenSSL {:d}.{:d}.{:d}".format(major, minor, fix)), ++ (s, t)) + + @support.cpython_only + def test_refcycle(self): +diff -r c0e311e010fc Lib/test/test_subprocess.py +--- a/Lib/test/test_subprocess.py ++++ b/Lib/test/test_subprocess.py +@@ -1926,6 +1926,86 @@ + "Some fds not in pass_fds were left open") + self.assertIn(1, remaining_fds, "Subprocess failed") + ++ ++ @unittest.skipIf(sys.platform.startswith("freebsd") and ++ os.stat("/dev").st_dev == os.stat("/dev/fd").st_dev, ++ "Requires fdescfs mounted on /dev/fd on FreeBSD.") ++ def test_close_fds_when_max_fd_is_lowered(self): ++ """Confirm that issue21618 is fixed (may fail under valgrind).""" ++ fd_status = support.findfile("fd_status.py", subdir="subprocessdata") ++ ++ # This launches the meat of the test in a child process to ++ # avoid messing with the larger unittest processes maximum ++ # number of file descriptors. ++ # This process launches: ++ # +--> Process that lowers its RLIMIT_NOFILE aftr setting up ++ # a bunch of high open fds above the new lower rlimit. ++ # Those are reported via stdout before launching a new ++ # process with close_fds=False to run the actual test: ++ # +--> The TEST: This one launches a fd_status.py ++ # subprocess with close_fds=True so we can find out if ++ # any of the fds above the lowered rlimit are still open. ++ p = subprocess.Popen([sys.executable, '-c', textwrap.dedent( ++ ''' ++ import os, resource, subprocess, sys, textwrap ++ open_fds = set() ++ # Add a bunch more fds to pass down. ++ for _ in range(40): ++ fd = os.open("/dev/null", os.O_RDONLY) ++ open_fds.add(fd) ++ ++ # Leave a two pairs of low ones available for use by the ++ # internal child error pipe and the stdout pipe. ++ # We also leave 10 more open as some Python buildbots run into ++ # "too many open files" errors during the test if we do not. ++ for fd in sorted(open_fds)[:14]: ++ os.close(fd) ++ open_fds.remove(fd) ++ ++ for fd in open_fds: ++ #self.addCleanup(os.close, fd) ++ os.set_inheritable(fd, True) ++ ++ max_fd_open = max(open_fds) ++ ++ # Communicate the open_fds to the parent unittest.TestCase process. ++ print(','.join(map(str, sorted(open_fds)))) ++ sys.stdout.flush() ++ ++ rlim_cur, rlim_max = resource.getrlimit(resource.RLIMIT_NOFILE) ++ try: ++ # 29 is lower than the highest fds we are leaving open. ++ resource.setrlimit(resource.RLIMIT_NOFILE, (29, rlim_max)) ++ # Launch a new Python interpreter with our low fd rlim_cur that ++ # inherits open fds above that limit. It then uses subprocess ++ # with close_fds=True to get a report of open fds in the child. ++ # An explicit list of fds to check is passed to fd_status.py as ++ # letting fd_status rely on its default logic would miss the ++ # fds above rlim_cur as it normally only checks up to that limit. 
++ subprocess.Popen( ++ [sys.executable, '-c', ++ textwrap.dedent(""" ++ import subprocess, sys ++ subprocess.Popen([sys.executable, %r] + ++ [str(x) for x in range({max_fd})], ++ close_fds=True).wait() ++ """.format(max_fd=max_fd_open+1))], ++ close_fds=False).wait() ++ finally: ++ resource.setrlimit(resource.RLIMIT_NOFILE, (rlim_cur, rlim_max)) ++ ''' % fd_status)], stdout=subprocess.PIPE) ++ ++ output, unused_stderr = p.communicate() ++ output_lines = output.splitlines() ++ self.assertEqual(len(output_lines), 2, ++ msg="expected exactly two lines of output:\n%r" % output) ++ opened_fds = set(map(int, output_lines[0].strip().split(b','))) ++ remaining_fds = set(map(int, output_lines[1].strip().split(b','))) ++ ++ self.assertFalse(remaining_fds & opened_fds, ++ msg="Some fds were left open.") ++ ++ + # Mac OS X Tiger (10.4) has a kernel bug: sometimes, the file + # descriptor of a pipe closed in the parent process is valid in the + # child process according to fstat(), but the mode of the file +diff -r c0e311e010fc Lib/test/test_tarfile.py +--- a/Lib/test/test_tarfile.py ++++ b/Lib/test/test_tarfile.py +@@ -352,10 +352,16 @@ + + + class MiscReadTestBase(CommonReadTest): ++ def requires_name_attribute(self): ++ pass ++ + def test_no_name_argument(self): ++ self.requires_name_attribute() + with open(self.tarname, "rb") as fobj: +- tar = tarfile.open(fileobj=fobj, mode=self.mode) +- self.assertEqual(tar.name, os.path.abspath(fobj.name)) ++ self.assertIsInstance(fobj.name, str) ++ with tarfile.open(fileobj=fobj, mode=self.mode) as tar: ++ self.assertIsInstance(tar.name, str) ++ self.assertEqual(tar.name, os.path.abspath(fobj.name)) + + def test_no_name_attribute(self): + with open(self.tarname, "rb") as fobj: +@@ -363,7 +369,7 @@ + fobj = io.BytesIO(data) + self.assertRaises(AttributeError, getattr, fobj, "name") + tar = tarfile.open(fileobj=fobj, mode=self.mode) +- self.assertEqual(tar.name, None) ++ self.assertIsNone(tar.name) + + def test_empty_name_attribute(self): + with open(self.tarname, "rb") as fobj: +@@ -371,7 +377,25 @@ + fobj = io.BytesIO(data) + fobj.name = "" + with tarfile.open(fileobj=fobj, mode=self.mode) as tar: +- self.assertEqual(tar.name, None) ++ self.assertIsNone(tar.name) ++ ++ def test_int_name_attribute(self): ++ # Issue 21044: tarfile.open() should handle fileobj with an integer ++ # 'name' attribute. 
++ fd = os.open(self.tarname, os.O_RDONLY) ++ with open(fd, 'rb') as fobj: ++ self.assertIsInstance(fobj.name, int) ++ with tarfile.open(fileobj=fobj, mode=self.mode) as tar: ++ self.assertIsNone(tar.name) ++ ++ def test_bytes_name_attribute(self): ++ self.requires_name_attribute() ++ tarname = os.fsencode(self.tarname) ++ with open(tarname, 'rb') as fobj: ++ self.assertIsInstance(fobj.name, bytes) ++ with tarfile.open(fileobj=fobj, mode=self.mode) as tar: ++ self.assertIsInstance(tar.name, bytes) ++ self.assertEqual(tar.name, os.path.abspath(fobj.name)) + + def test_illegal_mode_arg(self): + with open(tmpname, 'wb'): +@@ -549,11 +573,11 @@ + pass + + class Bz2MiscReadTest(Bz2Test, MiscReadTestBase, unittest.TestCase): +- def test_no_name_argument(self): ++ def requires_name_attribute(self): + self.skipTest("BZ2File have no name attribute") + + class LzmaMiscReadTest(LzmaTest, MiscReadTestBase, unittest.TestCase): +- def test_no_name_argument(self): ++ def requires_name_attribute(self): + self.skipTest("LZMAFile have no name attribute") + + +diff -r c0e311e010fc Lib/test/test_tcl.py +--- a/Lib/test/test_tcl.py ++++ b/Lib/test/test_tcl.py +@@ -133,6 +133,50 @@ + tcl = self.interp + self.assertRaises(TclError,tcl.unsetvar,'a') + ++ def test_getint(self): ++ tcl = self.interp.tk ++ self.assertEqual(tcl.getint(' 42 '), 42) ++ self.assertEqual(tcl.getint(42), 42) ++ self.assertRaises(TypeError, tcl.getint) ++ self.assertRaises(TypeError, tcl.getint, '42', '10') ++ self.assertRaises(TypeError, tcl.getint, b'42') ++ self.assertRaises(TypeError, tcl.getint, 42.0) ++ self.assertRaises(TclError, tcl.getint, 'a') ++ self.assertRaises((TypeError, ValueError, TclError), ++ tcl.getint, '42\0') ++ self.assertRaises((UnicodeEncodeError, ValueError, TclError), ++ tcl.getint, '42\ud800') ++ ++ def test_getdouble(self): ++ tcl = self.interp.tk ++ self.assertEqual(tcl.getdouble(' 42 '), 42.0) ++ self.assertEqual(tcl.getdouble(' 42.5 '), 42.5) ++ self.assertEqual(tcl.getdouble(42.5), 42.5) ++ self.assertRaises(TypeError, tcl.getdouble) ++ self.assertRaises(TypeError, tcl.getdouble, '42.5', '10') ++ self.assertRaises(TypeError, tcl.getdouble, b'42.5') ++ self.assertRaises(TypeError, tcl.getdouble, 42) ++ self.assertRaises(TclError, tcl.getdouble, 'a') ++ self.assertRaises((TypeError, ValueError, TclError), ++ tcl.getdouble, '42.5\0') ++ self.assertRaises((UnicodeEncodeError, ValueError, TclError), ++ tcl.getdouble, '42.5\ud800') ++ ++ def test_getboolean(self): ++ tcl = self.interp.tk ++ self.assertIs(tcl.getboolean('on'), True) ++ self.assertIs(tcl.getboolean('1'), True) ++ self.assertEqual(tcl.getboolean(42), 42) ++ self.assertRaises(TypeError, tcl.getboolean) ++ self.assertRaises(TypeError, tcl.getboolean, 'on', '1') ++ self.assertRaises(TypeError, tcl.getboolean, b'on') ++ self.assertRaises(TypeError, tcl.getboolean, 1.0) ++ self.assertRaises(TclError, tcl.getboolean, 'a') ++ self.assertRaises((TypeError, ValueError, TclError), ++ tcl.getboolean, 'on\0') ++ self.assertRaises((UnicodeEncodeError, ValueError, TclError), ++ tcl.getboolean, 'on\ud800') ++ + def testEvalFile(self): + tcl = self.interp + with open(support.TESTFN, 'w') as f: +@@ -362,10 +406,9 @@ + self.assertEqual(passValue(float('inf')), float('inf')) + self.assertEqual(passValue(-float('inf')), -float('inf')) + else: +- f = float(passValue(float('nan'))) +- self.assertNotEqual(f, f) + self.assertEqual(float(passValue(float('inf'))), float('inf')) + self.assertEqual(float(passValue(-float('inf'))), -float('inf')) ++ # XXX NaN representation can 
be not parsable by float() + self.assertEqual(passValue((1, '2', (3.4,))), + (1, '2', (3.4,)) if self.wantobjects else '1 2 3.4') + +@@ -387,9 +430,6 @@ + expected = float(expected) + self.assertAlmostEqual(float(actual), expected, + delta=abs(expected) * 1e-10) +- def nan_eq(actual, expected): +- actual = float(actual) +- self.assertNotEqual(actual, actual) + + check(True, '1') + check(False, '0') +@@ -412,7 +452,7 @@ + check(f, f, eq=float_eq) + check(float('inf'), 'Inf', eq=float_eq) + check(-float('inf'), '-Inf', eq=float_eq) +- check(float('nan'), 'NaN', eq=nan_eq) ++ # XXX NaN representation can be not parsable by float() + check((), '') + check((1, (2,), (3, 4), '5 6', ()), '1 2 {3 4} {5 6} {}') + +@@ -513,10 +553,35 @@ + @support.cpython_only + @unittest.skipUnless(INT_MAX < PY_SSIZE_T_MAX, "needs UINT_MAX < SIZE_MAX") + @support.bigmemtest(size=INT_MAX + 1, memuse=5, dry_run=False) +- def test_huge_string(self, size): ++ def test_huge_string_call(self, size): + value = ' ' * size + self.assertRaises(OverflowError, self.interp.call, 'set', '_', value) + ++ @support.cpython_only ++ @unittest.skipUnless(INT_MAX < PY_SSIZE_T_MAX, "needs UINT_MAX < SIZE_MAX") ++ @support.bigmemtest(size=INT_MAX + 1, memuse=9, dry_run=False) ++ def test_huge_string_builtins(self, size): ++ value = '1' + ' ' * size ++ self.assertRaises(OverflowError, self.interp.tk.getint, value) ++ self.assertRaises(OverflowError, self.interp.tk.getdouble, value) ++ self.assertRaises(OverflowError, self.interp.tk.getboolean, value) ++ self.assertRaises(OverflowError, self.interp.eval, value) ++ self.assertRaises(OverflowError, self.interp.evalfile, value) ++ self.assertRaises(OverflowError, self.interp.record, value) ++ self.assertRaises(OverflowError, self.interp.adderrorinfo, value) ++ self.assertRaises(OverflowError, self.interp.setvar, value, 'x', 'a') ++ self.assertRaises(OverflowError, self.interp.setvar, 'x', value, 'a') ++ self.assertRaises(OverflowError, self.interp.unsetvar, value) ++ self.assertRaises(OverflowError, self.interp.unsetvar, 'x', value) ++ self.assertRaises(OverflowError, self.interp.adderrorinfo, value) ++ self.assertRaises(OverflowError, self.interp.exprstring, value) ++ self.assertRaises(OverflowError, self.interp.exprlong, value) ++ self.assertRaises(OverflowError, self.interp.exprboolean, value) ++ self.assertRaises(OverflowError, self.interp.splitlist, value) ++ self.assertRaises(OverflowError, self.interp.split, value) ++ self.assertRaises(OverflowError, self.interp.createcommand, value, max) ++ self.assertRaises(OverflowError, self.interp.deletecommand, value) ++ + + def setUpModule(): + if support.verbose: +diff -r c0e311e010fc Lib/test/test_tk.py +--- a/Lib/test/test_tk.py ++++ b/Lib/test/test_tk.py +@@ -10,15 +10,9 @@ + + from tkinter.test import runtktests + +-def test_main(enable_gui=False): +- if enable_gui: +- if support.use_resources is None: +- support.use_resources = ['gui'] +- elif 'gui' not in support.use_resources: +- support.use_resources.append('gui') +- ++def test_main(): + support.run_unittest( + *runtktests.get_tests(text=False, packages=['test_tkinter'])) + + if __name__ == '__main__': +- test_main(enable_gui=True) ++ test_main() +diff -r c0e311e010fc Lib/test/test_tools.py +--- a/Lib/test/test_tools.py ++++ /dev/null +@@ -1,465 +0,0 @@ +-"""Tests for scripts in the Tools directory. +- +-This file contains regression tests for some of the scripts found in the +-Tools directory of a Python checkout or tarball, such as reindent.py. 
+-""" +- +-import os +-import sys +-import importlib._bootstrap +-import importlib.machinery +-import unittest +-from unittest import mock +-import shutil +-import subprocess +-import sysconfig +-import tempfile +-import textwrap +-from test import support +-from test.script_helper import assert_python_ok, temp_dir +- +-if not sysconfig.is_python_build(): +- # XXX some installers do contain the tools, should we detect that +- # and run the tests in that case too? +- raise unittest.SkipTest('test irrelevant for an installed Python') +- +-basepath = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), +- 'Tools') +-scriptsdir = os.path.join(basepath, 'scripts') +- +- +-class ReindentTests(unittest.TestCase): +- script = os.path.join(scriptsdir, 'reindent.py') +- +- def test_noargs(self): +- assert_python_ok(self.script) +- +- def test_help(self): +- rc, out, err = assert_python_ok(self.script, '-h') +- self.assertEqual(out, b'') +- self.assertGreater(err, b'') +- +- +-class PindentTests(unittest.TestCase): +- script = os.path.join(scriptsdir, 'pindent.py') +- +- def assertFileEqual(self, fn1, fn2): +- with open(fn1) as f1, open(fn2) as f2: +- self.assertEqual(f1.readlines(), f2.readlines()) +- +- def pindent(self, source, *args): +- with subprocess.Popen( +- (sys.executable, self.script) + args, +- stdin=subprocess.PIPE, stdout=subprocess.PIPE, +- universal_newlines=True) as proc: +- out, err = proc.communicate(source) +- self.assertIsNone(err) +- return out +- +- def lstriplines(self, data): +- return '\n'.join(line.lstrip() for line in data.splitlines()) + '\n' +- +- def test_selftest(self): +- self.maxDiff = None +- with temp_dir() as directory: +- data_path = os.path.join(directory, '_test.py') +- with open(self.script) as f: +- closed = f.read() +- with open(data_path, 'w') as f: +- f.write(closed) +- +- rc, out, err = assert_python_ok(self.script, '-d', data_path) +- self.assertEqual(out, b'') +- self.assertEqual(err, b'') +- backup = data_path + '~' +- self.assertTrue(os.path.exists(backup)) +- with open(backup) as f: +- self.assertEqual(f.read(), closed) +- with open(data_path) as f: +- clean = f.read() +- compile(clean, '_test.py', 'exec') +- self.assertEqual(self.pindent(clean, '-c'), closed) +- self.assertEqual(self.pindent(closed, '-d'), clean) +- +- rc, out, err = assert_python_ok(self.script, '-c', data_path) +- self.assertEqual(out, b'') +- self.assertEqual(err, b'') +- with open(backup) as f: +- self.assertEqual(f.read(), clean) +- with open(data_path) as f: +- self.assertEqual(f.read(), closed) +- +- broken = self.lstriplines(closed) +- with open(data_path, 'w') as f: +- f.write(broken) +- rc, out, err = assert_python_ok(self.script, '-r', data_path) +- self.assertEqual(out, b'') +- self.assertEqual(err, b'') +- with open(backup) as f: +- self.assertEqual(f.read(), broken) +- with open(data_path) as f: +- indented = f.read() +- compile(indented, '_test.py', 'exec') +- self.assertEqual(self.pindent(broken, '-r'), indented) +- +- def pindent_test(self, clean, closed): +- self.assertEqual(self.pindent(clean, '-c'), closed) +- self.assertEqual(self.pindent(closed, '-d'), clean) +- broken = self.lstriplines(closed) +- self.assertEqual(self.pindent(broken, '-r', '-e', '-s', '4'), closed) +- +- def test_statements(self): +- clean = textwrap.dedent("""\ +- if a: +- pass +- +- if a: +- pass +- else: +- pass +- +- if a: +- pass +- elif: +- pass +- else: +- pass +- +- while a: +- break +- +- while a: +- break +- else: +- pass +- +- for i in a: +- break +- +- for i 
in a: +- break +- else: +- pass +- +- try: +- pass +- finally: +- pass +- +- try: +- pass +- except TypeError: +- pass +- except ValueError: +- pass +- else: +- pass +- +- try: +- pass +- except TypeError: +- pass +- except ValueError: +- pass +- finally: +- pass +- +- with a: +- pass +- +- class A: +- pass +- +- def f(): +- pass +- """) +- +- closed = textwrap.dedent("""\ +- if a: +- pass +- # end if +- +- if a: +- pass +- else: +- pass +- # end if +- +- if a: +- pass +- elif: +- pass +- else: +- pass +- # end if +- +- while a: +- break +- # end while +- +- while a: +- break +- else: +- pass +- # end while +- +- for i in a: +- break +- # end for +- +- for i in a: +- break +- else: +- pass +- # end for +- +- try: +- pass +- finally: +- pass +- # end try +- +- try: +- pass +- except TypeError: +- pass +- except ValueError: +- pass +- else: +- pass +- # end try +- +- try: +- pass +- except TypeError: +- pass +- except ValueError: +- pass +- finally: +- pass +- # end try +- +- with a: +- pass +- # end with +- +- class A: +- pass +- # end class A +- +- def f(): +- pass +- # end def f +- """) +- self.pindent_test(clean, closed) +- +- def test_multilevel(self): +- clean = textwrap.dedent("""\ +- def foobar(a, b): +- if a == b: +- a = a+1 +- elif a < b: +- b = b-1 +- if b > a: a = a-1 +- else: +- print 'oops!' +- """) +- closed = textwrap.dedent("""\ +- def foobar(a, b): +- if a == b: +- a = a+1 +- elif a < b: +- b = b-1 +- if b > a: a = a-1 +- # end if +- else: +- print 'oops!' +- # end if +- # end def foobar +- """) +- self.pindent_test(clean, closed) +- +- def test_preserve_indents(self): +- clean = textwrap.dedent("""\ +- if a: +- if b: +- pass +- """) +- closed = textwrap.dedent("""\ +- if a: +- if b: +- pass +- # end if +- # end if +- """) +- self.assertEqual(self.pindent(clean, '-c'), closed) +- self.assertEqual(self.pindent(closed, '-d'), clean) +- broken = self.lstriplines(closed) +- self.assertEqual(self.pindent(broken, '-r', '-e', '-s', '9'), closed) +- clean = textwrap.dedent("""\ +- if a: +- \tif b: +- \t\tpass +- """) +- closed = textwrap.dedent("""\ +- if a: +- \tif b: +- \t\tpass +- \t# end if +- # end if +- """) +- self.assertEqual(self.pindent(clean, '-c'), closed) +- self.assertEqual(self.pindent(closed, '-d'), clean) +- broken = self.lstriplines(closed) +- self.assertEqual(self.pindent(broken, '-r'), closed) +- +- def test_escaped_newline(self): +- clean = textwrap.dedent("""\ +- class\\ +- \\ +- A: +- def\ +- \\ +- f: +- pass +- """) +- closed = textwrap.dedent("""\ +- class\\ +- \\ +- A: +- def\ +- \\ +- f: +- pass +- # end def f +- # end class A +- """) +- self.assertEqual(self.pindent(clean, '-c'), closed) +- self.assertEqual(self.pindent(closed, '-d'), clean) +- +- def test_empty_line(self): +- clean = textwrap.dedent("""\ +- if a: +- +- pass +- """) +- closed = textwrap.dedent("""\ +- if a: +- +- pass +- # end if +- """) +- self.pindent_test(clean, closed) +- +- def test_oneline(self): +- clean = textwrap.dedent("""\ +- if a: pass +- """) +- closed = textwrap.dedent("""\ +- if a: pass +- # end if +- """) +- self.pindent_test(clean, closed) +- +- +-class TestSundryScripts(unittest.TestCase): +- # At least make sure the rest don't have syntax errors. When tests are +- # added for a script it should be added to the whitelist below. +- +- # scripts that have independent tests. 
+- whitelist = ['reindent.py', 'pdeps.py', 'gprof2html'] +- # scripts that can't be imported without running +- blacklist = ['make_ctype.py'] +- # scripts that use windows-only modules +- windows_only = ['win_add2path.py'] +- # blacklisted for other reasons +- other = ['analyze_dxp.py'] +- +- skiplist = blacklist + whitelist + windows_only + other +- +- def setUp(self): +- cm = support.DirsOnSysPath(scriptsdir) +- cm.__enter__() +- self.addCleanup(cm.__exit__) +- +- def test_sundry(self): +- for fn in os.listdir(scriptsdir): +- if fn.endswith('.py') and fn not in self.skiplist: +- __import__(fn[:-3]) +- +- @unittest.skipIf(sys.platform != "win32", "Windows-only test") +- def test_sundry_windows(self): +- for fn in self.windows_only: +- __import__(fn[:-3]) +- +- @unittest.skipIf(not support.threading, "test requires _thread module") +- def test_analyze_dxp_import(self): +- if hasattr(sys, 'getdxp'): +- import analyze_dxp +- else: +- with self.assertRaises(RuntimeError): +- import analyze_dxp +- +- +-class PdepsTests(unittest.TestCase): +- +- @classmethod +- def setUpClass(self): +- path = os.path.join(scriptsdir, 'pdeps.py') +- spec = importlib.util.spec_from_file_location('pdeps', path) +- self.pdeps = importlib._bootstrap._SpecMethods(spec).load() +- +- @classmethod +- def tearDownClass(self): +- if 'pdeps' in sys.modules: +- del sys.modules['pdeps'] +- +- def test_process_errors(self): +- # Issue #14492: m_import.match(line) can be None. +- with tempfile.TemporaryDirectory() as tmpdir: +- fn = os.path.join(tmpdir, 'foo') +- with open(fn, 'w') as stream: +- stream.write("#!/this/will/fail") +- self.pdeps.process(fn, {}) +- +- def test_inverse_attribute_error(self): +- # Issue #14492: this used to fail with an AttributeError. +- self.pdeps.inverse({'a': []}) +- +- +-class Gprof2htmlTests(unittest.TestCase): +- +- def setUp(self): +- path = os.path.join(scriptsdir, 'gprof2html.py') +- spec = importlib.util.spec_from_file_location('gprof2html', path) +- self.gprof = importlib._bootstrap._SpecMethods(spec).load() +- oldargv = sys.argv +- def fixup(): +- sys.argv = oldargv +- self.addCleanup(fixup) +- sys.argv = [] +- +- def test_gprof(self): +- # Issue #14508: this used to fail with an NameError. 
+- with mock.patch.object(self.gprof, 'webbrowser') as wmock, \ +- tempfile.TemporaryDirectory() as tmpdir: +- fn = os.path.join(tmpdir, 'abc') +- open(fn, 'w').close() +- sys.argv = ['gprof2html', fn] +- self.gprof.main() +- self.assertTrue(wmock.open.called) +- +- +-# Run the tests in Tools/parser/test_unparse.py +-with support.DirsOnSysPath(os.path.join(basepath, 'parser')): +- from test_unparse import UnparseTestCase +- from test_unparse import DirectoryTestCase +- +- +-def test_main(): +- support.run_unittest(*[obj for obj in globals().values() +- if isinstance(obj, type)]) +- +- +-if __name__ == '__main__': +- unittest.main() +diff -r c0e311e010fc Lib/test/test_tools/__init__.py +--- /dev/null ++++ b/Lib/test/test_tools/__init__.py +@@ -0,0 +1,25 @@ ++"""Support functions for testing scripts in the Tools directory.""" ++import os ++import unittest ++import importlib ++from test import support ++from fnmatch import fnmatch ++ ++basepath = os.path.dirname( # ++ os.path.dirname( # Lib ++ os.path.dirname( # test ++ os.path.dirname(__file__)))) # test_tools ++ ++toolsdir = os.path.join(basepath, 'Tools') ++scriptsdir = os.path.join(toolsdir, 'scripts') ++ ++def skip_if_missing(): ++ if not os.path.isdir(scriptsdir): ++ raise unittest.SkipTest('scripts directory could not be found') ++ ++def import_tool(toolname): ++ with support.DirsOnSysPath(scriptsdir): ++ return importlib.import_module(toolname) ++ ++def load_tests(*args): ++ return support.load_package_tests(os.path.dirname(__file__), *args) +diff -r c0e311e010fc Lib/test/test_tools/__main__.py +--- /dev/null ++++ b/Lib/test/test_tools/__main__.py +@@ -0,0 +1,4 @@ ++from test.test_tools import load_tests ++import unittest ++ ++unittest.main() +diff -r c0e311e010fc Lib/test/test_tools/test_gprof2html.py +--- /dev/null ++++ b/Lib/test/test_tools/test_gprof2html.py +@@ -0,0 +1,36 @@ ++"""Tests for the gprof2html script in the Tools directory.""" ++ ++import os ++import sys ++import importlib ++import unittest ++from unittest import mock ++import tempfile ++ ++from test.test_tools import scriptsdir, skip_if_missing, import_tool ++ ++skip_if_missing() ++ ++class Gprof2htmlTests(unittest.TestCase): ++ ++ def setUp(self): ++ self.gprof = import_tool('gprof2html') ++ oldargv = sys.argv ++ def fixup(): ++ sys.argv = oldargv ++ self.addCleanup(fixup) ++ sys.argv = [] ++ ++ def test_gprof(self): ++ # Issue #14508: this used to fail with an NameError. 
++ with mock.patch.object(self.gprof, 'webbrowser') as wmock, \ ++ tempfile.TemporaryDirectory() as tmpdir: ++ fn = os.path.join(tmpdir, 'abc') ++ open(fn, 'w').close() ++ sys.argv = ['gprof2html', fn] ++ self.gprof.main() ++ self.assertTrue(wmock.open.called) ++ ++ ++if __name__ == '__main__': ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_tools/test_md5sum.py +--- /dev/null ++++ b/Lib/test/test_tools/test_md5sum.py +@@ -0,0 +1,77 @@ ++"""Tests for the md5sum script in the Tools directory.""" ++ ++import os ++import sys ++import unittest ++from test import support ++from test.script_helper import assert_python_ok, assert_python_failure ++ ++from test.test_tools import scriptsdir, import_tool, skip_if_missing ++ ++skip_if_missing() ++ ++class MD5SumTests(unittest.TestCase): ++ @classmethod ++ def setUpClass(cls): ++ cls.script = os.path.join(scriptsdir, 'md5sum.py') ++ os.mkdir(support.TESTFN) ++ cls.fodder = os.path.join(support.TESTFN, 'md5sum.fodder') ++ with open(cls.fodder, 'wb') as f: ++ f.write(b'md5sum\r\ntest file\r\n') ++ cls.fodder_md5 = b'd38dae2eb1ab346a292ef6850f9e1a0d' ++ cls.fodder_textmode_md5 = b'a8b07894e2ca3f2a4c3094065fa6e0a5' ++ ++ @classmethod ++ def tearDownClass(cls): ++ support.rmtree(support.TESTFN) ++ ++ def test_noargs(self): ++ rc, out, err = assert_python_ok(self.script) ++ self.assertEqual(rc, 0) ++ self.assertTrue( ++ out.startswith(b'd41d8cd98f00b204e9800998ecf8427e ')) ++ self.assertFalse(err) ++ ++ def test_checksum_fodder(self): ++ rc, out, err = assert_python_ok(self.script, self.fodder) ++ self.assertEqual(rc, 0) ++ self.assertTrue(out.startswith(self.fodder_md5)) ++ for part in self.fodder.split(os.path.sep): ++ self.assertIn(part.encode(), out) ++ self.assertFalse(err) ++ ++ def test_dash_l(self): ++ rc, out, err = assert_python_ok(self.script, '-l', self.fodder) ++ self.assertEqual(rc, 0) ++ self.assertIn(self.fodder_md5, out) ++ parts = self.fodder.split(os.path.sep) ++ self.assertIn(parts[-1].encode(), out) ++ self.assertNotIn(parts[-2].encode(), out) ++ ++ def test_dash_t(self): ++ rc, out, err = assert_python_ok(self.script, '-t', self.fodder) ++ self.assertEqual(rc, 0) ++ self.assertTrue(out.startswith(self.fodder_textmode_md5)) ++ self.assertNotIn(self.fodder_md5, out) ++ ++ def test_dash_s(self): ++ rc, out, err = assert_python_ok(self.script, '-s', '512', self.fodder) ++ self.assertEqual(rc, 0) ++ self.assertIn(self.fodder_md5, out) ++ ++ def test_multiple_files(self): ++ rc, out, err = assert_python_ok(self.script, self.fodder, self.fodder) ++ self.assertEqual(rc, 0) ++ lines = out.splitlines() ++ self.assertEqual(len(lines), 2) ++ self.assertEqual(*lines) ++ ++ def test_usage(self): ++ rc, out, err = assert_python_failure(self.script, '-h') ++ self.assertEqual(rc, 2) ++ self.assertEqual(out, b'') ++ self.assertGreater(err, b'') ++ ++ ++if __name__ == '__main__': ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_tools/test_pdeps.py +--- /dev/null ++++ b/Lib/test/test_tools/test_pdeps.py +@@ -0,0 +1,34 @@ ++"""Tests for the pdeps script in the Tools directory.""" ++ ++import os ++import sys ++import unittest ++import tempfile ++from test import support ++ ++from test.test_tools import scriptsdir, skip_if_missing, import_tool ++ ++skip_if_missing() ++ ++ ++class PdepsTests(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(self): ++ self.pdeps = import_tool('pdeps') ++ ++ def test_process_errors(self): ++ # Issue #14492: m_import.match(line) can be None. 
++ with tempfile.TemporaryDirectory() as tmpdir: ++ fn = os.path.join(tmpdir, 'foo') ++ with open(fn, 'w') as stream: ++ stream.write("#!/this/will/fail") ++ self.pdeps.process(fn, {}) ++ ++ def test_inverse_attribute_error(self): ++ # Issue #14492: this used to fail with an AttributeError. ++ self.pdeps.inverse({'a': []}) ++ ++ ++if __name__ == '__main__': ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_tools/test_pindent.py +--- /dev/null ++++ b/Lib/test/test_tools/test_pindent.py +@@ -0,0 +1,339 @@ ++"""Tests for the pindent script in the Tools directory.""" ++ ++import os ++import sys ++import unittest ++import subprocess ++import textwrap ++from test import support ++from test.script_helper import assert_python_ok ++ ++from test.test_tools import scriptsdir, skip_if_missing ++ ++skip_if_missing() ++ ++ ++class PindentTests(unittest.TestCase): ++ script = os.path.join(scriptsdir, 'pindent.py') ++ ++ def assertFileEqual(self, fn1, fn2): ++ with open(fn1) as f1, open(fn2) as f2: ++ self.assertEqual(f1.readlines(), f2.readlines()) ++ ++ def pindent(self, source, *args): ++ with subprocess.Popen( ++ (sys.executable, self.script) + args, ++ stdin=subprocess.PIPE, stdout=subprocess.PIPE, ++ universal_newlines=True) as proc: ++ out, err = proc.communicate(source) ++ self.assertIsNone(err) ++ return out ++ ++ def lstriplines(self, data): ++ return '\n'.join(line.lstrip() for line in data.splitlines()) + '\n' ++ ++ def test_selftest(self): ++ self.maxDiff = None ++ with support.temp_dir() as directory: ++ data_path = os.path.join(directory, '_test.py') ++ with open(self.script) as f: ++ closed = f.read() ++ with open(data_path, 'w') as f: ++ f.write(closed) ++ ++ rc, out, err = assert_python_ok(self.script, '-d', data_path) ++ self.assertEqual(out, b'') ++ self.assertEqual(err, b'') ++ backup = data_path + '~' ++ self.assertTrue(os.path.exists(backup)) ++ with open(backup) as f: ++ self.assertEqual(f.read(), closed) ++ with open(data_path) as f: ++ clean = f.read() ++ compile(clean, '_test.py', 'exec') ++ self.assertEqual(self.pindent(clean, '-c'), closed) ++ self.assertEqual(self.pindent(closed, '-d'), clean) ++ ++ rc, out, err = assert_python_ok(self.script, '-c', data_path) ++ self.assertEqual(out, b'') ++ self.assertEqual(err, b'') ++ with open(backup) as f: ++ self.assertEqual(f.read(), clean) ++ with open(data_path) as f: ++ self.assertEqual(f.read(), closed) ++ ++ broken = self.lstriplines(closed) ++ with open(data_path, 'w') as f: ++ f.write(broken) ++ rc, out, err = assert_python_ok(self.script, '-r', data_path) ++ self.assertEqual(out, b'') ++ self.assertEqual(err, b'') ++ with open(backup) as f: ++ self.assertEqual(f.read(), broken) ++ with open(data_path) as f: ++ indented = f.read() ++ compile(indented, '_test.py', 'exec') ++ self.assertEqual(self.pindent(broken, '-r'), indented) ++ ++ def pindent_test(self, clean, closed): ++ self.assertEqual(self.pindent(clean, '-c'), closed) ++ self.assertEqual(self.pindent(closed, '-d'), clean) ++ broken = self.lstriplines(closed) ++ self.assertEqual(self.pindent(broken, '-r', '-e', '-s', '4'), closed) ++ ++ def test_statements(self): ++ clean = textwrap.dedent("""\ ++ if a: ++ pass ++ ++ if a: ++ pass ++ else: ++ pass ++ ++ if a: ++ pass ++ elif: ++ pass ++ else: ++ pass ++ ++ while a: ++ break ++ ++ while a: ++ break ++ else: ++ pass ++ ++ for i in a: ++ break ++ ++ for i in a: ++ break ++ else: ++ pass ++ ++ try: ++ pass ++ finally: ++ pass ++ ++ try: ++ pass ++ except TypeError: ++ pass ++ except ValueError: ++ pass ++ else: ++ pass 
++ ++ try: ++ pass ++ except TypeError: ++ pass ++ except ValueError: ++ pass ++ finally: ++ pass ++ ++ with a: ++ pass ++ ++ class A: ++ pass ++ ++ def f(): ++ pass ++ """) ++ ++ closed = textwrap.dedent("""\ ++ if a: ++ pass ++ # end if ++ ++ if a: ++ pass ++ else: ++ pass ++ # end if ++ ++ if a: ++ pass ++ elif: ++ pass ++ else: ++ pass ++ # end if ++ ++ while a: ++ break ++ # end while ++ ++ while a: ++ break ++ else: ++ pass ++ # end while ++ ++ for i in a: ++ break ++ # end for ++ ++ for i in a: ++ break ++ else: ++ pass ++ # end for ++ ++ try: ++ pass ++ finally: ++ pass ++ # end try ++ ++ try: ++ pass ++ except TypeError: ++ pass ++ except ValueError: ++ pass ++ else: ++ pass ++ # end try ++ ++ try: ++ pass ++ except TypeError: ++ pass ++ except ValueError: ++ pass ++ finally: ++ pass ++ # end try ++ ++ with a: ++ pass ++ # end with ++ ++ class A: ++ pass ++ # end class A ++ ++ def f(): ++ pass ++ # end def f ++ """) ++ self.pindent_test(clean, closed) ++ ++ def test_multilevel(self): ++ clean = textwrap.dedent("""\ ++ def foobar(a, b): ++ if a == b: ++ a = a+1 ++ elif a < b: ++ b = b-1 ++ if b > a: a = a-1 ++ else: ++ print 'oops!' ++ """) ++ closed = textwrap.dedent("""\ ++ def foobar(a, b): ++ if a == b: ++ a = a+1 ++ elif a < b: ++ b = b-1 ++ if b > a: a = a-1 ++ # end if ++ else: ++ print 'oops!' ++ # end if ++ # end def foobar ++ """) ++ self.pindent_test(clean, closed) ++ ++ def test_preserve_indents(self): ++ clean = textwrap.dedent("""\ ++ if a: ++ if b: ++ pass ++ """) ++ closed = textwrap.dedent("""\ ++ if a: ++ if b: ++ pass ++ # end if ++ # end if ++ """) ++ self.assertEqual(self.pindent(clean, '-c'), closed) ++ self.assertEqual(self.pindent(closed, '-d'), clean) ++ broken = self.lstriplines(closed) ++ self.assertEqual(self.pindent(broken, '-r', '-e', '-s', '9'), closed) ++ clean = textwrap.dedent("""\ ++ if a: ++ \tif b: ++ \t\tpass ++ """) ++ closed = textwrap.dedent("""\ ++ if a: ++ \tif b: ++ \t\tpass ++ \t# end if ++ # end if ++ """) ++ self.assertEqual(self.pindent(clean, '-c'), closed) ++ self.assertEqual(self.pindent(closed, '-d'), clean) ++ broken = self.lstriplines(closed) ++ self.assertEqual(self.pindent(broken, '-r'), closed) ++ ++ def test_escaped_newline(self): ++ clean = textwrap.dedent("""\ ++ class\\ ++ \\ ++ A: ++ def\ ++ \\ ++ f: ++ pass ++ """) ++ closed = textwrap.dedent("""\ ++ class\\ ++ \\ ++ A: ++ def\ ++ \\ ++ f: ++ pass ++ # end def f ++ # end class A ++ """) ++ self.assertEqual(self.pindent(clean, '-c'), closed) ++ self.assertEqual(self.pindent(closed, '-d'), clean) ++ ++ def test_empty_line(self): ++ clean = textwrap.dedent("""\ ++ if a: ++ ++ pass ++ """) ++ closed = textwrap.dedent("""\ ++ if a: ++ ++ pass ++ # end if ++ """) ++ self.pindent_test(clean, closed) ++ ++ def test_oneline(self): ++ clean = textwrap.dedent("""\ ++ if a: pass ++ """) ++ closed = textwrap.dedent("""\ ++ if a: pass ++ # end if ++ """) ++ self.pindent_test(clean, closed) ++ ++ ++if __name__ == '__main__': ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_tools/test_reindent.py +--- /dev/null ++++ b/Lib/test/test_tools/test_reindent.py +@@ -0,0 +1,28 @@ ++"""Tests for scripts in the Tools directory. ++ ++This file contains regression tests for some of the scripts found in the ++Tools directory of a Python checkout or tarball, such as reindent.py. 
++""" ++ ++import os ++import unittest ++from test.script_helper import assert_python_ok ++ ++from test.test_tools import scriptsdir, skip_if_missing ++ ++skip_if_missing() ++ ++class ReindentTests(unittest.TestCase): ++ script = os.path.join(scriptsdir, 'reindent.py') ++ ++ def test_noargs(self): ++ assert_python_ok(self.script) ++ ++ def test_help(self): ++ rc, out, err = assert_python_ok(self.script, '-h') ++ self.assertEqual(out, b'') ++ self.assertGreater(err, b'') ++ ++ ++if __name__ == '__main__': ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_tools/test_sundry.py +--- /dev/null ++++ b/Lib/test/test_tools/test_sundry.py +@@ -0,0 +1,53 @@ ++"""Tests for scripts in the Tools directory. ++ ++This file contains extremely basic regression tests for the scripts found in ++the Tools directory of a Python checkout or tarball which don't have separate ++tests of their own, such as h2py.py. ++""" ++ ++import os ++import sys ++import unittest ++from test import support ++ ++from test.test_tools import scriptsdir, import_tool, skip_if_missing ++ ++skip_if_missing() ++ ++class TestSundryScripts(unittest.TestCase): ++ # At least make sure the rest don't have syntax errors. When tests are ++ # added for a script it should be added to the whitelist below. ++ ++ # scripts that have independent tests. ++ whitelist = ['reindent', 'pdeps', 'gprof2html', 'md5sum'] ++ # scripts that can't be imported without running ++ blacklist = ['make_ctype'] ++ # scripts that use windows-only modules ++ windows_only = ['win_add2path'] ++ # blacklisted for other reasons ++ other = ['analyze_dxp'] ++ ++ skiplist = blacklist + whitelist + windows_only + other ++ ++ def test_sundry(self): ++ for fn in os.listdir(scriptsdir): ++ name = fn[:-3] ++ if fn.endswith('.py') and name not in self.skiplist: ++ import_tool(name) ++ ++ @unittest.skipIf(sys.platform != "win32", "Windows-only test") ++ def test_sundry_windows(self): ++ for name in self.windows_only: ++ import_tool(name) ++ ++ @unittest.skipIf(not support.threading, "test requires _thread module") ++ def test_analyze_dxp_import(self): ++ if hasattr(sys, 'getdxp'): ++ import_tool('analyze_dxp') ++ else: ++ with self.assertRaises(RuntimeError): ++ import_tool('analyze_dxp') ++ ++ ++if __name__ == '__main__': ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_tools/test_unparse.py +--- /dev/null ++++ b/Lib/test/test_tools/test_unparse.py +@@ -0,0 +1,282 @@ ++"""Tests for the unparse.py script in the Tools/parser directory.""" ++ ++import unittest ++import test.support ++import io ++import os ++import random ++import tokenize ++import ast ++ ++from test.test_tools import basepath, toolsdir, skip_if_missing ++ ++skip_if_missing() ++ ++parser_path = os.path.join(toolsdir, "parser") ++ ++with test.support.DirsOnSysPath(parser_path): ++ import unparse ++ ++def read_pyfile(filename): ++ """Read and return the contents of a Python source file (as a ++ string), taking into account the file encoding.""" ++ with open(filename, "rb") as pyfile: ++ encoding = tokenize.detect_encoding(pyfile.readline)[0] ++ with open(filename, "r", encoding=encoding) as pyfile: ++ source = pyfile.read() ++ return source ++ ++for_else = """\ ++def f(): ++ for x in range(10): ++ break ++ else: ++ y = 2 ++ z = 3 ++""" ++ ++while_else = """\ ++def g(): ++ while True: ++ break ++ else: ++ y = 2 ++ z = 3 ++""" ++ ++relative_import = """\ ++from . import fred ++from .. 
import barney ++from .australia import shrimp as prawns ++""" ++ ++nonlocal_ex = """\ ++def f(): ++ x = 1 ++ def g(): ++ nonlocal x ++ x = 2 ++ y = 7 ++ def h(): ++ nonlocal x, y ++""" ++ ++# also acts as test for 'except ... as ...' ++raise_from = """\ ++try: ++ 1 / 0 ++except ZeroDivisionError as e: ++ raise ArithmeticError from e ++""" ++ ++class_decorator = """\ ++@f1(arg) ++@f2 ++class Foo: pass ++""" ++ ++elif1 = """\ ++if cond1: ++ suite1 ++elif cond2: ++ suite2 ++else: ++ suite3 ++""" ++ ++elif2 = """\ ++if cond1: ++ suite1 ++elif cond2: ++ suite2 ++""" ++ ++try_except_finally = """\ ++try: ++ suite1 ++except ex1: ++ suite2 ++except ex2: ++ suite3 ++else: ++ suite4 ++finally: ++ suite5 ++""" ++ ++with_simple = """\ ++with f(): ++ suite1 ++""" ++ ++with_as = """\ ++with f() as x: ++ suite1 ++""" ++ ++with_two_items = """\ ++with f() as x, g() as y: ++ suite1 ++""" ++ ++class ASTTestCase(unittest.TestCase): ++ def assertASTEqual(self, ast1, ast2): ++ self.assertEqual(ast.dump(ast1), ast.dump(ast2)) ++ ++ def check_roundtrip(self, code1, filename="internal"): ++ ast1 = compile(code1, filename, "exec", ast.PyCF_ONLY_AST) ++ unparse_buffer = io.StringIO() ++ unparse.Unparser(ast1, unparse_buffer) ++ code2 = unparse_buffer.getvalue() ++ ast2 = compile(code2, filename, "exec", ast.PyCF_ONLY_AST) ++ self.assertASTEqual(ast1, ast2) ++ ++class UnparseTestCase(ASTTestCase): ++ # Tests for specific bugs found in earlier versions of unparse ++ ++ def test_del_statement(self): ++ self.check_roundtrip("del x, y, z") ++ ++ def test_shifts(self): ++ self.check_roundtrip("45 << 2") ++ self.check_roundtrip("13 >> 7") ++ ++ def test_for_else(self): ++ self.check_roundtrip(for_else) ++ ++ def test_while_else(self): ++ self.check_roundtrip(while_else) ++ ++ def test_unary_parens(self): ++ self.check_roundtrip("(-1)**7") ++ self.check_roundtrip("(-1.)**8") ++ self.check_roundtrip("(-1j)**6") ++ self.check_roundtrip("not True or False") ++ self.check_roundtrip("True or not False") ++ ++ def test_integer_parens(self): ++ self.check_roundtrip("3 .__abs__()") ++ ++ def test_huge_float(self): ++ self.check_roundtrip("1e1000") ++ self.check_roundtrip("-1e1000") ++ self.check_roundtrip("1e1000j") ++ self.check_roundtrip("-1e1000j") ++ ++ def test_min_int(self): ++ self.check_roundtrip(str(-2**31)) ++ self.check_roundtrip(str(-2**63)) ++ ++ def test_imaginary_literals(self): ++ self.check_roundtrip("7j") ++ self.check_roundtrip("-7j") ++ self.check_roundtrip("0j") ++ self.check_roundtrip("-0j") ++ ++ def test_lambda_parentheses(self): ++ self.check_roundtrip("(lambda: int)()") ++ ++ def test_chained_comparisons(self): ++ self.check_roundtrip("1 < 4 <= 5") ++ self.check_roundtrip("a is b is c is not d") ++ ++ def test_function_arguments(self): ++ self.check_roundtrip("def f(): pass") ++ self.check_roundtrip("def f(a): pass") ++ self.check_roundtrip("def f(b = 2): pass") ++ self.check_roundtrip("def f(a, b): pass") ++ self.check_roundtrip("def f(a, b = 2): pass") ++ self.check_roundtrip("def f(a = 5, b = 2): pass") ++ self.check_roundtrip("def f(*, a = 1, b = 2): pass") ++ self.check_roundtrip("def f(*, a = 1, b): pass") ++ self.check_roundtrip("def f(*, a, b = 2): pass") ++ self.check_roundtrip("def f(a, b = None, *, c, **kwds): pass") ++ self.check_roundtrip("def f(a=2, *args, c=5, d, **kwds): pass") ++ self.check_roundtrip("def f(*args, **kwargs): pass") ++ ++ def test_relative_import(self): ++ self.check_roundtrip(relative_import) ++ ++ def test_nonlocal(self): ++ self.check_roundtrip(nonlocal_ex) ++ ++ def 
test_raise_from(self): ++ self.check_roundtrip(raise_from) ++ ++ def test_bytes(self): ++ self.check_roundtrip("b'123'") ++ ++ def test_annotations(self): ++ self.check_roundtrip("def f(a : int): pass") ++ self.check_roundtrip("def f(a: int = 5): pass") ++ self.check_roundtrip("def f(*args: [int]): pass") ++ self.check_roundtrip("def f(**kwargs: dict): pass") ++ self.check_roundtrip("def f() -> None: pass") ++ ++ def test_set_literal(self): ++ self.check_roundtrip("{'a', 'b', 'c'}") ++ ++ def test_set_comprehension(self): ++ self.check_roundtrip("{x for x in range(5)}") ++ ++ def test_dict_comprehension(self): ++ self.check_roundtrip("{x: x*x for x in range(10)}") ++ ++ def test_class_decorators(self): ++ self.check_roundtrip(class_decorator) ++ ++ def test_class_definition(self): ++ self.check_roundtrip("class A(metaclass=type, *[], **{}): pass") ++ ++ def test_elifs(self): ++ self.check_roundtrip(elif1) ++ self.check_roundtrip(elif2) ++ ++ def test_try_except_finally(self): ++ self.check_roundtrip(try_except_finally) ++ ++ def test_starred_assignment(self): ++ self.check_roundtrip("a, *b, c = seq") ++ self.check_roundtrip("a, (*b, c) = seq") ++ self.check_roundtrip("a, *b[0], c = seq") ++ self.check_roundtrip("a, *(b, c) = seq") ++ ++ def test_with_simple(self): ++ self.check_roundtrip(with_simple) ++ ++ def test_with_as(self): ++ self.check_roundtrip(with_as) ++ ++ def test_with_two_items(self): ++ self.check_roundtrip(with_two_items) ++ ++ ++class DirectoryTestCase(ASTTestCase): ++ """Test roundtrip behaviour on all files in Lib and Lib/test.""" ++ ++ # test directories, relative to the root of the distribution ++ test_directories = 'Lib', os.path.join('Lib', 'test') ++ ++ def test_files(self): ++ # get names of files to test ++ ++ names = [] ++ for d in self.test_directories: ++ test_dir = os.path.join(basepath, d) ++ for n in os.listdir(test_dir): ++ if n.endswith('.py') and not n.startswith('bad'): ++ names.append(os.path.join(test_dir, n)) ++ ++ # Test limited subset of files unless the 'cpu' resource is specified. ++ if not test.support.is_resource_enabled("cpu"): ++ names = random.sample(names, 10) ++ ++ for filename in names: ++ if test.support.verbose: ++ print('Testing %s' % filename) ++ source = read_pyfile(filename) ++ self.check_roundtrip(source) ++ ++ ++if __name__ == '__main__': ++ unittest.main() +diff -r c0e311e010fc Lib/test/test_tracemalloc.py +--- a/Lib/test/test_tracemalloc.py ++++ b/Lib/test/test_tracemalloc.py +@@ -807,6 +807,12 @@ + b'number of frames', + stderr) + ++ def test_pymem_alloc0(self): ++ # Issue #21639: Check that PyMem_Malloc(0) with tracemalloc enabled ++ # does not crash. 
++ code = 'import _testcapi; _testcapi.test_pymem_alloc0(); 1' ++ assert_python_ok('-X', 'tracemalloc', '-c', code) ++ + + def test_main(): + support.run_unittest( +diff -r c0e311e010fc Lib/test/test_ttk_guionly.py +--- a/Lib/test/test_ttk_guionly.py ++++ b/Lib/test/test_ttk_guionly.py +@@ -22,13 +22,7 @@ + # assuming ttk is not available + raise unittest.SkipTest("ttk not available: %s" % msg) + +-def test_main(enable_gui=False): +- if enable_gui: +- if support.use_resources is None: +- support.use_resources = ['gui'] +- elif 'gui' not in support.use_resources: +- support.use_resources.append('gui') +- ++def test_main(): + try: + support.run_unittest( + *runtktests.get_tests(text=False, packages=['test_ttk'])) +@@ -36,4 +30,4 @@ + get_tk_root().destroy() + + if __name__ == '__main__': +- test_main(enable_gui=True) ++ test_main() +diff -r c0e311e010fc Lib/test/test_urllib.py +--- a/Lib/test/test_urllib.py ++++ b/Lib/test/test_urllib.py +@@ -7,6 +7,7 @@ + import email.message + import io + import unittest ++from unittest.mock import patch + from test import support + import os + import sys +@@ -89,6 +90,26 @@ + http.client.HTTPConnection = self._connection_class + + ++class FakeFTPMixin(object): ++ def fakeftp(self): ++ class FakeFtpWrapper(object): ++ def __init__(self, user, passwd, host, port, dirs, timeout=None, ++ persistent=True): ++ pass ++ ++ def retrfile(self, file, type): ++ return io.BytesIO(), 0 ++ ++ def close(self): ++ pass ++ ++ self._ftpwrapper_class = urllib.request.ftpwrapper ++ urllib.request.ftpwrapper = FakeFtpWrapper ++ ++ def unfakeftp(self): ++ urllib.request.ftpwrapper = self._ftpwrapper_class ++ ++ + class urlopen_FileTests(unittest.TestCase): + """Test urlopen() opening a temporary file. + +@@ -195,7 +216,7 @@ + self.env.set('NO_PROXY', 'localhost, anotherdomain.com, newdomain.com') + self.assertTrue(urllib.request.proxy_bypass_environment('anotherdomain.com')) + +-class urlopen_HttpTests(unittest.TestCase, FakeHTTPMixin): ++class urlopen_HttpTests(unittest.TestCase, FakeHTTPMixin, FakeFTPMixin): + """Test urlopen() opening a fake http connection.""" + + def check_read(self, ver): +@@ -309,6 +330,15 @@ + self.assertFalse(e.exception.filename) + self.assertTrue(e.exception.reason) + ++ @patch.object(urllib.request, 'MAXFTPCACHE', 0) ++ def test_ftp_cache_pruning(self): ++ self.fakeftp() ++ try: ++ urllib.request.ftpcache['test'] = urllib.request.ftpwrapper('user', 'pass', 'localhost', 21, []) ++ urlopen('ftp://localhost') ++ finally: ++ self.unfakeftp() ++ + + def test_userpass_inurl(self): + self.fakehttp(b"HTTP/1.0 200 OK\r\n\r\nHello!") +diff -r c0e311e010fc Lib/test/test_urllib2.py +--- a/Lib/test/test_urllib2.py ++++ b/Lib/test/test_urllib2.py +@@ -678,7 +678,7 @@ + self.assertEqual(int(headers["Content-length"]), len(data)) + + def test_file(self): +- import email.utils, socket ++ import email.utils + h = urllib.request.FileHandler() + o = h.parent = MockOpener() + +@@ -725,6 +725,7 @@ + for url in [ + "file://localhost:80%s" % urlpath, + "file:///file_does_not_exist.txt", ++ "file://not-a-local-host.com//dir/file.txt", + "file://%s:80%s/%s" % (socket.gethostbyname('localhost'), + os.getcwd(), TESTFN), + "file://somerandomhost.ontheinternet.com%s/%s" % +diff -r c0e311e010fc Lib/test/test_venv.py +--- a/Lib/test/test_venv.py ++++ b/Lib/test/test_venv.py +@@ -203,17 +203,22 @@ + """ + Test upgrading an existing environment directory. 
+ """ +- builder = venv.EnvBuilder(upgrade=True) +- self.run_with_capture(builder.create, self.env_dir) +- self.isdir(self.bindir) +- self.isdir(self.include) +- self.isdir(*self.lib) +- fn = self.get_env_file(self.bindir, self.exe) +- if not os.path.exists(fn): # diagnostics for Windows buildbot failures +- bd = self.get_env_file(self.bindir) +- print('Contents of %r:' % bd) +- print(' %r' % os.listdir(bd)) +- self.assertTrue(os.path.exists(fn), 'File %r should exist.' % fn) ++ # See Issue #21643: the loop needs to run twice to ensure ++ # that everything works on the upgrade (the first run just creates ++ # the venv). ++ for upgrade in (False, True): ++ builder = venv.EnvBuilder(upgrade=upgrade) ++ self.run_with_capture(builder.create, self.env_dir) ++ self.isdir(self.bindir) ++ self.isdir(self.include) ++ self.isdir(*self.lib) ++ fn = self.get_env_file(self.bindir, self.exe) ++ if not os.path.exists(fn): ++ # diagnostics for Windows buildbot failures ++ bd = self.get_env_file(self.bindir) ++ print('Contents of %r:' % bd) ++ print(' %r' % os.listdir(bd)) ++ self.assertTrue(os.path.exists(fn), 'File %r should exist.' % fn) + + def test_isolation(self): + """ +diff -r c0e311e010fc Lib/test/test_winreg.py +--- a/Lib/test/test_winreg.py ++++ b/Lib/test/test_winreg.py +@@ -341,7 +341,7 @@ + def test_queryvalueex_return_value(self): + # Test for Issue #16759, return unsigned int from QueryValueEx. + # Reg2Py, which gets called by QueryValueEx, was returning a value +- # generated by PyLong_FromLong. The implmentation now uses ++ # generated by PyLong_FromLong. The implementation now uses + # PyLong_FromUnsignedLong to match DWORD's size. + try: + with CreateKey(HKEY_CURRENT_USER, test_key_name) as ck: +@@ -354,6 +354,19 @@ + finally: + DeleteKey(HKEY_CURRENT_USER, test_key_name) + ++ def test_setvalueex_crash_with_none_arg(self): ++ # Test for Issue #21151, segfault when None is passed to SetValueEx ++ try: ++ with CreateKey(HKEY_CURRENT_USER, test_key_name) as ck: ++ self.assertNotEqual(ck.handle, 0) ++ test_val = None ++ SetValueEx(ck, "test_name", 0, REG_BINARY, test_val) ++ ret_val, ret_type = QueryValueEx(ck, "test_name") ++ self.assertEqual(ret_type, REG_BINARY) ++ self.assertEqual(ret_val, test_val) ++ finally: ++ DeleteKey(HKEY_CURRENT_USER, test_key_name) ++ + + + @unittest.skipUnless(REMOTE_NAME, "Skipping remote registry tests") +diff -r c0e311e010fc Lib/test/test_zipfile.py +--- a/Lib/test/test_zipfile.py ++++ b/Lib/test/test_zipfile.py +@@ -1290,6 +1290,21 @@ + self.assertRaises(ValueError, + zipfile.ZipInfo, 'seventies', (1979, 1, 1, 0, 0, 0)) + ++ def test_zipfile_with_short_extra_field(self): ++ """If an extra field in the header is less than 4 bytes, skip it.""" ++ zipdata = ( ++ b'PK\x03\x04\x14\x00\x00\x00\x00\x00\x93\x9b\xad@\x8b\x9e' ++ b'\xd9\xd3\x01\x00\x00\x00\x01\x00\x00\x00\x03\x00\x03\x00ab' ++ b'c\x00\x00\x00APK\x01\x02\x14\x03\x14\x00\x00\x00\x00' ++ b'\x00\x93\x9b\xad@\x8b\x9e\xd9\xd3\x01\x00\x00\x00\x01\x00\x00' ++ b'\x00\x03\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa4\x81\x00' ++ b'\x00\x00\x00abc\x00\x00PK\x05\x06\x00\x00\x00\x00' ++ b'\x01\x00\x01\x003\x00\x00\x00%\x00\x00\x00\x00\x00' ++ ) ++ with zipfile.ZipFile(io.BytesIO(zipdata), 'r') as zipf: ++ # testzip returns the name of the first corrupt file, or None ++ self.assertIsNone(zipf.testzip()) ++ + def tearDown(self): + unlink(TESTFN) + unlink(TESTFN2) +diff -r c0e311e010fc Lib/tkinter/__init__.py +--- a/Lib/tkinter/__init__.py ++++ b/Lib/tkinter/__init__.py +@@ -421,7 +421,10 @@ + + _flatten(args) + 
_flatten(list(kw.items()))) + def tk_menuBar(self, *args): + """Do not use. Needed in Tk 3.6 and earlier.""" +- pass # obsolete since Tk 4.0 ++ # obsolete since Tk 4.0 ++ import warnings ++ warnings.warn('tk_menuBar() does nothing and will be removed in 3.6', ++ DeprecationWarning, stacklevel=2) + def wait_variable(self, name='PY_VAR'): + """Wait until the variable is modified. + +@@ -2586,22 +2589,19 @@ + def activate(self, index): + """Activate item identified by INDEX.""" + self.tk.call(self._w, 'activate', index) +- def bbox(self, *args): ++ def bbox(self, index): + """Return a tuple of X1,Y1,X2,Y2 coordinates for a rectangle +- which encloses the item identified by index in ARGS.""" +- return self._getints( +- self.tk.call((self._w, 'bbox') + args)) or None ++ which encloses the item identified by the given index.""" ++ return self._getints(self.tk.call(self._w, 'bbox', index)) or None + def curselection(self): +- """Return list of indices of currently selected item.""" +- # XXX Ought to apply self._getints()... +- return self.tk.splitlist(self.tk.call( +- self._w, 'curselection')) ++ """Return the indices of currently selected item.""" ++ return self._getints(self.tk.call(self._w, 'curselection')) or () + def delete(self, first, last=None): +- """Delete items from FIRST to LAST (not included).""" ++ """Delete items from FIRST to LAST (included).""" + self.tk.call(self._w, 'delete', first, last) + def get(self, first, last=None): +- """Get list of items from FIRST to LAST (not included).""" +- if last: ++ """Get list of items from FIRST to LAST (included).""" ++ if last is not None: + return self.tk.splitlist(self.tk.call( + self._w, 'get', first, last)) + else: +@@ -2634,7 +2634,7 @@ + self.tk.call(self._w, 'selection', 'anchor', index) + select_anchor = selection_anchor + def selection_clear(self, first, last=None): +- """Clear the selection from FIRST to LAST (not included).""" ++ """Clear the selection from FIRST to LAST (included).""" + self.tk.call(self._w, + 'selection', 'clear', first, last) + select_clear = selection_clear +@@ -2644,7 +2644,7 @@ + self._w, 'selection', 'includes', index)) + select_includes = selection_includes + def selection_set(self, first, last=None): +- """Set the selection from FIRST to LAST (not included) without ++ """Set the selection from FIRST to LAST (included) without + changing the currently selected elements.""" + self.tk.call(self._w, 'selection', 'set', first, last) + select_set = selection_set +@@ -2677,7 +2677,11 @@ + selectcolor, takefocus, tearoff, tearoffcommand, title, type.""" + Widget.__init__(self, master, 'menu', cnf, kw) + def tk_bindForTraversal(self): +- pass # obsolete since Tk 4.0 ++ # obsolete since Tk 4.0 ++ import warnings ++ warnings.warn('tk_bindForTraversal() does nothing and ' ++ 'will be removed in 3.6', ++ DeprecationWarning, stacklevel=2) + def tk_mbPost(self): + self.tk.call('tk_mbPost', self._w) + def tk_mbUnpost(self): +diff -r c0e311e010fc Lib/tkinter/font.py +--- a/Lib/tkinter/font.py ++++ b/Lib/tkinter/font.py +@@ -81,7 +81,8 @@ + if exists: + self.delete_font = False + # confirm font exists +- if self.name not in root.tk.call("font", "names"): ++ if self.name not in root.tk.splitlist( ++ root.tk.call("font", "names")): + raise tkinter._tkinter.TclError( + "named font %s does not already exist" % (self.name,)) + # if font config info supplied, apply it +diff -r c0e311e010fc Lib/tkinter/test/runtktests.py +--- a/Lib/tkinter/test/runtktests.py ++++ b/Lib/tkinter/test/runtktests.py +@@ -68,5 +68,4 @@ + yield test + 
+ if __name__ == "__main__": +- test.support.use_resources = ['gui'] + test.support.run_unittest(*get_tests()) +diff -r c0e311e010fc Lib/tkinter/test/test_tkinter/test_images.py +--- /dev/null ++++ b/Lib/tkinter/test/test_tkinter/test_images.py +@@ -0,0 +1,341 @@ ++import unittest ++import tkinter ++from tkinter import ttk ++from test import support ++from tkinter.test.support import requires_tcl ++ ++support.requires('gui') ++ ++ ++class MiscTest(unittest.TestCase): ++ ++ def setUp(self): ++ self.root = ttk.setup_master() ++ ++ def test_image_types(self): ++ image_types = self.root.image_types() ++ self.assertIsInstance(image_types, tuple) ++ self.assertIn('photo', image_types) ++ self.assertIn('bitmap', image_types) ++ ++ def test_image_names(self): ++ image_names = self.root.image_names() ++ self.assertIsInstance(image_names, tuple) ++ ++ ++class BitmapImageTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ cls.testfile = support.findfile('python.xbm', subdir='imghdrdata') ++ ++ def setUp(self): ++ self.root = ttk.setup_master() ++ ++ def test_create_from_file(self): ++ image = tkinter.BitmapImage('::img::test', master=self.root, ++ foreground='yellow', background='blue', ++ file=self.testfile) ++ self.assertEqual(str(image), '::img::test') ++ self.assertEqual(image.type(), 'bitmap') ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ self.assertIn('::img::test', self.root.image_names()) ++ del image ++ self.assertNotIn('::img::test', self.root.image_names()) ++ ++ def test_create_from_data(self): ++ with open(self.testfile, 'rb') as f: ++ data = f.read() ++ image = tkinter.BitmapImage('::img::test', master=self.root, ++ foreground='yellow', background='blue', ++ data=data) ++ self.assertEqual(str(image), '::img::test') ++ self.assertEqual(image.type(), 'bitmap') ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ self.assertIn('::img::test', self.root.image_names()) ++ del image ++ self.assertNotIn('::img::test', self.root.image_names()) ++ ++ def assertEqualStrList(self, actual, expected): ++ self.assertIsInstance(actual, str) ++ self.assertEqual(self.root.splitlist(actual), expected) ++ ++ def test_configure_data(self): ++ image = tkinter.BitmapImage('::img::test', master=self.root) ++ self.assertEqual(image['data'], '-data {} {} {} {}') ++ with open(self.testfile, 'rb') as f: ++ data = f.read() ++ image.configure(data=data) ++ self.assertEqualStrList(image['data'], ++ ('-data', '', '', '', data.decode('ascii'))) ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ ++ self.assertEqual(image['maskdata'], '-maskdata {} {} {} {}') ++ image.configure(maskdata=data) ++ self.assertEqualStrList(image['maskdata'], ++ ('-maskdata', '', '', '', data.decode('ascii'))) ++ ++ def test_configure_file(self): ++ image = tkinter.BitmapImage('::img::test', master=self.root) ++ self.assertEqual(image['file'], '-file {} {} {} {}') ++ image.configure(file=self.testfile) ++ self.assertEqualStrList(image['file'], ++ ('-file', '', '', '',self.testfile)) ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ ++ self.assertEqual(image['maskfile'], '-maskfile {} {} {} {}') ++ image.configure(maskfile=self.testfile) ++ self.assertEqualStrList(image['maskfile'], ++ ('-maskfile', '', '', '', self.testfile)) ++ ++ def test_configure_background(self): ++ image = tkinter.BitmapImage('::img::test', master=self.root) ++ self.assertEqual(image['background'], '-background {} {} {} {}') 
++ image.configure(background='blue') ++ self.assertEqual(image['background'], '-background {} {} {} blue') ++ ++ def test_configure_foreground(self): ++ image = tkinter.BitmapImage('::img::test', master=self.root) ++ self.assertEqual(image['foreground'], ++ '-foreground {} {} #000000 #000000') ++ image.configure(foreground='yellow') ++ self.assertEqual(image['foreground'], ++ '-foreground {} {} #000000 yellow') ++ ++ ++class PhotoImageTest(unittest.TestCase): ++ ++ @classmethod ++ def setUpClass(cls): ++ cls.testfile = support.findfile('python.gif', subdir='imghdrdata') ++ ++ def setUp(self): ++ self.root = ttk.setup_master() ++ self.wantobjects = self.root.wantobjects() ++ ++ def create(self): ++ return tkinter.PhotoImage('::img::test', master=self.root, ++ file=self.testfile) ++ ++ def colorlist(self, *args): ++ if tkinter.TkVersion >= 8.6 and self.wantobjects: ++ return args ++ else: ++ return tkinter._join(args) ++ ++ def check_create_from_file(self, ext): ++ testfile = support.findfile('python.' + ext, subdir='imghdrdata') ++ image = tkinter.PhotoImage('::img::test', master=self.root, ++ file=testfile) ++ self.assertEqual(str(image), '::img::test') ++ self.assertEqual(image.type(), 'photo') ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ self.assertEqual(image['data'], '') ++ self.assertEqual(image['file'], testfile) ++ self.assertIn('::img::test', self.root.image_names()) ++ del image ++ self.assertNotIn('::img::test', self.root.image_names()) ++ ++ def check_create_from_data(self, ext): ++ testfile = support.findfile('python.' + ext, subdir='imghdrdata') ++ with open(testfile, 'rb') as f: ++ data = f.read() ++ image = tkinter.PhotoImage('::img::test', master=self.root, ++ data=data) ++ self.assertEqual(str(image), '::img::test') ++ self.assertEqual(image.type(), 'photo') ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ self.assertEqual(image['data'], data if self.wantobjects ++ else data.decode('latin1')) ++ self.assertEqual(image['file'], '') ++ self.assertIn('::img::test', self.root.image_names()) ++ del image ++ self.assertNotIn('::img::test', self.root.image_names()) ++ ++ def test_create_from_ppm_file(self): ++ self.check_create_from_file('ppm') ++ ++ @unittest.skip('issue #21580') ++ def test_create_from_ppm_data(self): ++ self.check_create_from_data('ppm') ++ ++ def test_create_from_pgm_file(self): ++ self.check_create_from_file('pgm') ++ ++ @unittest.skip('issue #21580') ++ def test_create_from_pgm_data(self): ++ self.check_create_from_data('pgm') ++ ++ def test_create_from_gif_file(self): ++ self.check_create_from_file('gif') ++ ++ @unittest.skip('issue #21580') ++ def test_create_from_gif_data(self): ++ self.check_create_from_data('gif') ++ ++ @requires_tcl(8, 6) ++ def test_create_from_png_file(self): ++ self.check_create_from_file('png') ++ ++ @unittest.skip('issue #21580') ++ @requires_tcl(8, 6) ++ def test_create_from_png_data(self): ++ self.check_create_from_data('png') ++ ++ @unittest.skip('issue #21580') ++ def test_configure_data(self): ++ image = tkinter.PhotoImage('::img::test', master=self.root) ++ self.assertEqual(image['data'], '') ++ with open(self.testfile, 'rb') as f: ++ data = f.read() ++ image.configure(data=data) ++ self.assertEqual(image['data'], data if self.wantobjects ++ else data.decode('latin1')) ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ ++ def test_configure_format(self): ++ image = tkinter.PhotoImage('::img::test', master=self.root) ++ 
self.assertEqual(image['format'], '') ++ image.configure(file=self.testfile, format='gif') ++ self.assertEqual(image['format'], ('gif',) if self.wantobjects ++ else 'gif') ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ ++ def test_configure_file(self): ++ image = tkinter.PhotoImage('::img::test', master=self.root) ++ self.assertEqual(image['file'], '') ++ image.configure(file=self.testfile) ++ self.assertEqual(image['file'], self.testfile) ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ ++ def test_configure_gamma(self): ++ image = tkinter.PhotoImage('::img::test', master=self.root) ++ self.assertEqual(image['gamma'], '1.0') ++ image.configure(gamma=2.0) ++ self.assertEqual(image['gamma'], '2.0') ++ ++ def test_configure_width_height(self): ++ image = tkinter.PhotoImage('::img::test', master=self.root) ++ self.assertEqual(image['width'], '0') ++ self.assertEqual(image['height'], '0') ++ image.configure(width=20) ++ image.configure(height=10) ++ self.assertEqual(image['width'], '20') ++ self.assertEqual(image['height'], '10') ++ self.assertEqual(image.width(), 20) ++ self.assertEqual(image.height(), 10) ++ ++ def test_configure_palette(self): ++ image = tkinter.PhotoImage('::img::test', master=self.root) ++ self.assertEqual(image['palette'], '') ++ image.configure(palette=256) ++ self.assertEqual(image['palette'], '256') ++ image.configure(palette='3/4/2') ++ self.assertEqual(image['palette'], '3/4/2') ++ ++ def test_blank(self): ++ image = self.create() ++ image.blank() ++ self.assertEqual(image.width(), 16) ++ self.assertEqual(image.height(), 16) ++ self.assertEqual(image.get(4, 6), self.colorlist(0, 0, 0)) ++ ++ def test_copy(self): ++ image = self.create() ++ image2 = image.copy() ++ self.assertEqual(image2.width(), 16) ++ self.assertEqual(image2.height(), 16) ++ self.assertEqual(image.get(4, 6), image.get(4, 6)) ++ ++ def test_subsample(self): ++ image = self.create() ++ image2 = image.subsample(2, 3) ++ self.assertEqual(image2.width(), 8) ++ self.assertEqual(image2.height(), 6) ++ self.assertEqual(image2.get(2, 2), image.get(4, 6)) ++ ++ image2 = image.subsample(2) ++ self.assertEqual(image2.width(), 8) ++ self.assertEqual(image2.height(), 8) ++ self.assertEqual(image2.get(2, 3), image.get(4, 6)) ++ ++ def test_zoom(self): ++ image = self.create() ++ image2 = image.zoom(2, 3) ++ self.assertEqual(image2.width(), 32) ++ self.assertEqual(image2.height(), 48) ++ self.assertEqual(image2.get(8, 18), image.get(4, 6)) ++ self.assertEqual(image2.get(9, 20), image.get(4, 6)) ++ ++ image2 = image.zoom(2) ++ self.assertEqual(image2.width(), 32) ++ self.assertEqual(image2.height(), 32) ++ self.assertEqual(image2.get(8, 12), image.get(4, 6)) ++ self.assertEqual(image2.get(9, 13), image.get(4, 6)) ++ ++ def test_put(self): ++ image = self.create() ++ image.put('{red green} {blue yellow}', to=(4, 6)) ++ self.assertEqual(image.get(4, 6), self.colorlist(255, 0, 0)) ++ self.assertEqual(image.get(5, 6), ++ self.colorlist(0, 128 if tkinter.TkVersion >= 8.6 ++ else 255, 0)) ++ self.assertEqual(image.get(4, 7), self.colorlist(0, 0, 255)) ++ self.assertEqual(image.get(5, 7), self.colorlist(255, 255, 0)) ++ ++ image.put((('#f00', '#00ff00'), ('#000000fff', '#ffffffff0000'))) ++ self.assertEqual(image.get(0, 0), self.colorlist(255, 0, 0)) ++ self.assertEqual(image.get(1, 0), self.colorlist(0, 255, 0)) ++ self.assertEqual(image.get(0, 1), self.colorlist(0, 0, 255)) ++ self.assertEqual(image.get(1, 1), self.colorlist(255, 255, 0)) ++ ++ def 
test_get(self): ++ image = self.create() ++ self.assertEqual(image.get(4, 6), self.colorlist(62, 116, 162)) ++ self.assertEqual(image.get(0, 0), self.colorlist(0, 0, 0)) ++ self.assertEqual(image.get(15, 15), self.colorlist(0, 0, 0)) ++ self.assertRaises(tkinter.TclError, image.get, -1, 0) ++ self.assertRaises(tkinter.TclError, image.get, 0, -1) ++ self.assertRaises(tkinter.TclError, image.get, 16, 15) ++ self.assertRaises(tkinter.TclError, image.get, 15, 16) ++ ++ def test_write(self): ++ image = self.create() ++ self.addCleanup(support.unlink, support.TESTFN) ++ ++ image.write(support.TESTFN) ++ image2 = tkinter.PhotoImage('::img::test2', master=self.root, ++ format='ppm', ++ file=support.TESTFN) ++ self.assertEqual(str(image2), '::img::test2') ++ self.assertEqual(image2.type(), 'photo') ++ self.assertEqual(image2.width(), 16) ++ self.assertEqual(image2.height(), 16) ++ self.assertEqual(image2.get(0, 0), image.get(0, 0)) ++ self.assertEqual(image2.get(15, 8), image.get(15, 8)) ++ ++ image.write(support.TESTFN, format='gif', from_coords=(4, 6, 6, 9)) ++ image3 = tkinter.PhotoImage('::img::test3', master=self.root, ++ format='gif', ++ file=support.TESTFN) ++ self.assertEqual(str(image3), '::img::test3') ++ self.assertEqual(image3.type(), 'photo') ++ self.assertEqual(image3.width(), 2) ++ self.assertEqual(image3.height(), 3) ++ self.assertEqual(image3.get(0, 0), image.get(4, 6)) ++ self.assertEqual(image3.get(1, 2), image.get(5, 8)) ++ ++ ++tests_gui = (MiscTest, BitmapImageTest, PhotoImageTest,) ++ ++if __name__ == "__main__": ++ support.run_unittest(*tests_gui) +diff -r c0e311e010fc Lib/tkinter/test/test_tkinter/test_widgets.py +--- a/Lib/tkinter/test/test_tkinter/test_widgets.py ++++ b/Lib/tkinter/test/test_tkinter/test_widgets.py +@@ -1,5 +1,6 @@ + import unittest + import tkinter ++from tkinter import TclError + import os + import sys + from test.support import requires +@@ -466,11 +467,7 @@ + + def test_bbox(self): + widget = self.create() +- bbox = widget.bbox(0) +- self.assertEqual(len(bbox), 4) +- for item in bbox: +- self.assertIsInstance(item, int) +- ++ self.assertIsBoundingBox(widget.bbox(0)) + self.assertRaises(tkinter.TclError, widget.bbox, 'noindex') + self.assertRaises(tkinter.TclError, widget.bbox, None) + self.assertRaises(TypeError, widget.bbox) +@@ -623,11 +620,7 @@ + + def test_bbox(self): + widget = self.create() +- bbox = widget.bbox('1.1') +- self.assertEqual(len(bbox), 4) +- for item in bbox: +- self.assertIsInstance(item, int) +- ++ self.assertIsBoundingBox(widget.bbox('1.1')) + self.assertIsNone(widget.bbox('end')) + self.assertRaises(tkinter.TclError, widget.bbox, 'noindex') + self.assertRaises(tkinter.TclError, widget.bbox, None) +@@ -730,6 +723,101 @@ + widget = self.create() + self.checkEnumParam(widget, 'state', 'disabled', 'normal') + ++ def test_itemconfigure(self): ++ widget = self.create() ++ with self.assertRaisesRegex(TclError, 'item number "0" out of range'): ++ widget.itemconfigure(0) ++ colors = 'red orange yellow green blue white violet'.split() ++ widget.insert('end', *colors) ++ for i, color in enumerate(colors): ++ widget.itemconfigure(i, background=color) ++ with self.assertRaises(TypeError): ++ widget.itemconfigure() ++ with self.assertRaisesRegex(TclError, 'bad listbox index "red"'): ++ widget.itemconfigure('red') ++ self.assertEqual(widget.itemconfigure(0, 'background'), ++ ('background', 'background', 'Background', '', 'red')) ++ self.assertEqual(widget.itemconfigure('end', 'background'), ++ ('background', 'background', 'Background', '', 
'violet')) ++ self.assertEqual(widget.itemconfigure('@0,0', 'background'), ++ ('background', 'background', 'Background', '', 'red')) ++ ++ d = widget.itemconfigure(0) ++ self.assertIsInstance(d, dict) ++ for k, v in d.items(): ++ self.assertIn(len(v), (2, 5)) ++ if len(v) == 5: ++ self.assertEqual(v, widget.itemconfigure(0, k)) ++ self.assertEqual(v[4], widget.itemcget(0, k)) ++ ++ def check_itemconfigure(self, name, value): ++ widget = self.create() ++ widget.insert('end', 'a', 'b', 'c', 'd') ++ widget.itemconfigure(0, **{name: value}) ++ self.assertEqual(widget.itemconfigure(0, name)[4], value) ++ self.assertEqual(widget.itemcget(0, name), value) ++ with self.assertRaisesRegex(TclError, 'unknown color name "spam"'): ++ widget.itemconfigure(0, **{name: 'spam'}) ++ ++ def test_itemconfigure_background(self): ++ self.check_itemconfigure('background', '#ff0000') ++ ++ def test_itemconfigure_bg(self): ++ self.check_itemconfigure('bg', '#ff0000') ++ ++ def test_itemconfigure_fg(self): ++ self.check_itemconfigure('fg', '#110022') ++ ++ def test_itemconfigure_foreground(self): ++ self.check_itemconfigure('foreground', '#110022') ++ ++ def test_itemconfigure_selectbackground(self): ++ self.check_itemconfigure('selectbackground', '#110022') ++ ++ def test_itemconfigure_selectforeground(self): ++ self.check_itemconfigure('selectforeground', '#654321') ++ ++ def test_box(self): ++ lb = self.create() ++ lb.insert(0, *('el%d' % i for i in range(8))) ++ lb.pack() ++ self.assertIsBoundingBox(lb.bbox(0)) ++ self.assertIsNone(lb.bbox(-1)) ++ self.assertIsNone(lb.bbox(10)) ++ self.assertRaises(TclError, lb.bbox, 'noindex') ++ self.assertRaises(TclError, lb.bbox, None) ++ self.assertRaises(TypeError, lb.bbox) ++ self.assertRaises(TypeError, lb.bbox, 0, 1) ++ ++ def test_curselection(self): ++ lb = self.create() ++ lb.insert(0, *('el%d' % i for i in range(8))) ++ lb.selection_clear(0, tkinter.END) ++ lb.selection_set(2, 4) ++ lb.selection_set(6) ++ self.assertEqual(lb.curselection(), (2, 3, 4, 6)) ++ self.assertRaises(TypeError, lb.curselection, 0) ++ ++ def test_get(self): ++ lb = self.create() ++ lb.insert(0, *('el%d' % i for i in range(8))) ++ self.assertEqual(lb.get(0), 'el0') ++ self.assertEqual(lb.get(3), 'el3') ++ self.assertEqual(lb.get('end'), 'el7') ++ self.assertEqual(lb.get(8), '') ++ self.assertEqual(lb.get(-1), '') ++ self.assertEqual(lb.get(3, 5), ('el3', 'el4', 'el5')) ++ self.assertEqual(lb.get(5, 'end'), ('el5', 'el6', 'el7')) ++ self.assertEqual(lb.get(5, 0), ()) ++ self.assertEqual(lb.get(0, 0), ('el0',)) ++ self.assertRaises(TclError, lb.get, 'noindex') ++ self.assertRaises(TclError, lb.get, None) ++ self.assertRaises(TypeError, lb.get) ++ self.assertRaises(TclError, lb.get, 'end', 'noindex') ++ self.assertRaises(TypeError, lb.get, 1, 2, 3) ++ self.assertRaises(TclError, lb.get, 2.4) ++ ++ + @add_standard_options(PixelSizeTests, StandardOptionsTests) + class ScaleTest(AbstractWidgetTest, unittest.TestCase): + OPTIONS = ( +@@ -828,6 +916,24 @@ + self.checkEnumParam(widget, 'orient', 'vertical', 'horizontal', + errmsg='bad orientation "{}": must be vertical or horizontal') + ++ def test_activate(self): ++ sb = self.create() ++ for e in ('arrow1', 'slider', 'arrow2'): ++ sb.activate(e) ++ sb.activate('') ++ self.assertRaises(TypeError, sb.activate) ++ self.assertRaises(TypeError, sb.activate, 'arrow1', 'arrow2') ++ ++ def test_set(self): ++ sb = self.create() ++ sb.set(0.2, 0.4) ++ self.assertEqual(sb.get(), (0.2, 0.4)) ++ self.assertRaises(TclError, sb.set, 'abc', 'def') ++ 
self.assertRaises(TclError, sb.set, 0.6, 'def') ++ self.assertRaises(TclError, sb.set, 0.6, None) ++ self.assertRaises(TclError, sb.set, 0.6) ++ self.assertRaises(TclError, sb.set, 0.6, 0.7, 0.8) ++ + + @add_standard_options(StandardOptionsTests) + class PanedWindowTest(AbstractWidgetTest, unittest.TestCase): +@@ -887,6 +993,105 @@ + self.checkPixelsParam(widget, 'width', 402, 403.4, 404.6, -402, 0, '5i', + conv=noconv) + ++ def create2(self): ++ p = self.create() ++ b = tkinter.Button(p) ++ c = tkinter.Button(p) ++ p.add(b) ++ p.add(c) ++ return p, b, c ++ ++ def test_paneconfigure(self): ++ p, b, c = self.create2() ++ self.assertRaises(TypeError, p.paneconfigure) ++ d = p.paneconfigure(b) ++ self.assertIsInstance(d, dict) ++ for k, v in d.items(): ++ self.assertEqual(len(v), 5) ++ self.assertEqual(v, p.paneconfigure(b, k)) ++ self.assertEqual(v[4], p.panecget(b, k)) ++ ++ def check_paneconfigure(self, p, b, name, value, expected, stringify=False): ++ conv = lambda x: x ++ if not self.wantobjects or stringify: ++ expected = str(expected) ++ if self.wantobjects and stringify: ++ conv = str ++ p.paneconfigure(b, **{name: value}) ++ self.assertEqual(conv(p.paneconfigure(b, name)[4]), expected) ++ self.assertEqual(conv(p.panecget(b, name)), expected) ++ ++ def check_paneconfigure_bad(self, p, b, name, msg): ++ with self.assertRaisesRegex(TclError, msg): ++ p.paneconfigure(b, **{name: 'badValue'}) ++ ++ def test_paneconfigure_after(self): ++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'after', c, str(c)) ++ self.check_paneconfigure_bad(p, b, 'after', ++ 'bad window path name "badValue"') ++ ++ def test_paneconfigure_before(self): ++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'before', c, str(c)) ++ self.check_paneconfigure_bad(p, b, 'before', ++ 'bad window path name "badValue"') ++ ++ def test_paneconfigure_height(self): ++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'height', 10, 10, ++ stringify=tcl_version < (8, 5)) ++ self.check_paneconfigure_bad(p, b, 'height', ++ 'bad screen distance "badValue"') ++ ++ @requires_tcl(8, 5) ++ def test_paneconfigure_hide(self): ++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'hide', False, 0) ++ self.check_paneconfigure_bad(p, b, 'hide', ++ 'expected boolean value but got "badValue"') ++ ++ def test_paneconfigure_minsize(self): ++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'minsize', 10, 10) ++ self.check_paneconfigure_bad(p, b, 'minsize', ++ 'bad screen distance "badValue"') ++ ++ def test_paneconfigure_padx(self): ++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'padx', 1.3, 1) ++ self.check_paneconfigure_bad(p, b, 'padx', ++ 'bad screen distance "badValue"') ++ ++ def test_paneconfigure_pady(self): ++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'pady', 1.3, 1) ++ self.check_paneconfigure_bad(p, b, 'pady', ++ 'bad screen distance "badValue"') ++ ++ def test_paneconfigure_sticky(self): ++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'sticky', 'nsew', 'nesw') ++ self.check_paneconfigure_bad(p, b, 'sticky', ++ 'bad stickyness value "badValue": must ' ++ 'be a string containing zero or more of ' ++ 'n, e, s, and w') ++ ++ @requires_tcl(8, 5) ++ def test_paneconfigure_stretch(self): ++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'stretch', 'alw', 'always') ++ self.check_paneconfigure_bad(p, b, 'stretch', ++ 'bad stretch "badValue": must be ' ++ 'always, first, last, middle, or never') ++ ++ def test_paneconfigure_width(self): 
++ p, b, c = self.create2() ++ self.check_paneconfigure(p, b, 'width', 10, 10, ++ stringify=tcl_version < (8, 5)) ++ self.check_paneconfigure_bad(p, b, 'width', ++ 'bad screen distance "badValue"') ++ + + @add_standard_options(StandardOptionsTests) + class MenuTest(AbstractWidgetTest, unittest.TestCase): +@@ -923,6 +1128,39 @@ + self.checkEnumParam(widget, 'type', + 'normal', 'tearoff', 'menubar') + ++ def test_entryconfigure(self): ++ m1 = self.create() ++ m1.add_command(label='test') ++ self.assertRaises(TypeError, m1.entryconfigure) ++ with self.assertRaisesRegex(TclError, 'bad menu entry index "foo"'): ++ m1.entryconfigure('foo') ++ d = m1.entryconfigure(1) ++ self.assertIsInstance(d, dict) ++ for k, v in d.items(): ++ self.assertIsInstance(k, str) ++ self.assertIsInstance(v, tuple) ++ self.assertEqual(len(v), 5) ++ self.assertEqual(v[0], k) ++ self.assertEqual(m1.entrycget(1, k), v[4]) ++ m1.destroy() ++ ++ def test_entryconfigure_label(self): ++ m1 = self.create() ++ m1.add_command(label='test') ++ self.assertEqual(m1.entrycget(1, 'label'), 'test') ++ m1.entryconfigure(1, label='changed') ++ self.assertEqual(m1.entrycget(1, 'label'), 'changed') ++ ++ def test_entryconfigure_variable(self): ++ m1 = self.create() ++ v1 = tkinter.BooleanVar(self.root) ++ v2 = tkinter.BooleanVar(self.root) ++ m1.add_checkbutton(variable=v1, onvalue=True, offvalue=False, ++ label='Nonsense') ++ self.assertEqual(str(m1.entrycget(1, 'variable')), str(v1)) ++ m1.entryconfigure(1, variable=v2) ++ self.assertEqual(str(m1.entrycget(1, 'variable')), str(v2)) ++ + + @add_standard_options(PixelSizeTests, StandardOptionsTests) + class MessageTest(AbstractWidgetTest, unittest.TestCase): +diff -r c0e311e010fc Lib/tkinter/test/test_ttk/test_functions.py +--- a/Lib/tkinter/test/test_ttk/test_functions.py ++++ b/Lib/tkinter/test/test_ttk/test_functions.py +@@ -1,7 +1,19 @@ + # -*- encoding: utf-8 -*- + import unittest ++import tkinter + from tkinter import ttk + ++class MockTkApp: ++ ++ def splitlist(self, arg): ++ if isinstance(arg, tuple): ++ return arg ++ return arg.split(':') ++ ++ def wantobjects(self): ++ return True ++ ++ + class MockTclObj(object): + typename = 'test' + +@@ -352,20 +364,22 @@ + + + def test_list_from_layouttuple(self): ++ tk = MockTkApp() ++ + # empty layout tuple +- self.assertFalse(ttk._list_from_layouttuple(())) ++ self.assertFalse(ttk._list_from_layouttuple(tk, ())) + + # shortest layout tuple +- self.assertEqual(ttk._list_from_layouttuple(('name', )), ++ self.assertEqual(ttk._list_from_layouttuple(tk, ('name', )), + [('name', {})]) + + # not so interesting ltuple + sample_ltuple = ('name', '-option', 'value') +- self.assertEqual(ttk._list_from_layouttuple(sample_ltuple), ++ self.assertEqual(ttk._list_from_layouttuple(tk, sample_ltuple), + [('name', {'option': 'value'})]) + + # empty children +- self.assertEqual(ttk._list_from_layouttuple( ++ self.assertEqual(ttk._list_from_layouttuple(tk, + ('something', '-children', ())), + [('something', {'children': []})] + ) +@@ -378,7 +392,7 @@ + ) + ) + ) +- self.assertEqual(ttk._list_from_layouttuple(ltuple), ++ self.assertEqual(ttk._list_from_layouttuple(tk, ltuple), + [('name', {'option': 'niceone', 'children': + [('otherone', {'otheropt': 'othervalue', 'children': + [('child', {})] +@@ -387,29 +401,35 @@ + ) + + # bad tuples +- self.assertRaises(ValueError, ttk._list_from_layouttuple, ++ self.assertRaises(ValueError, ttk._list_from_layouttuple, tk, + ('name', 'no_minus')) +- self.assertRaises(ValueError, ttk._list_from_layouttuple, ++ 
self.assertRaises(ValueError, ttk._list_from_layouttuple, tk, + ('name', 'no_minus', 'value')) +- self.assertRaises(ValueError, ttk._list_from_layouttuple, ++ self.assertRaises(ValueError, ttk._list_from_layouttuple, tk, + ('something', '-children')) # no children +- import tkinter +- if not tkinter._default_root or tkinter._default_root.wantobjects(): +- self.assertRaises(ValueError, ttk._list_from_layouttuple, +- ('something', '-children', 'value')) # invalid children + + + def test_val_or_dict(self): +- def func(opt, val=None): ++ def func(res, opt=None, val=None): ++ if opt is None: ++ return res + if val is None: + return "test val" + return (opt, val) + +- options = {'test': None} +- self.assertEqual(ttk._val_or_dict(options, func), "test val") ++ tk = MockTkApp() ++ tk.call = func + +- options = {'test': 3} +- self.assertEqual(ttk._val_or_dict(options, func), options) ++ self.assertEqual(ttk._val_or_dict(tk, {}, '-test:3'), ++ {'test': '3'}) ++ self.assertEqual(ttk._val_or_dict(tk, {}, ('-test', 3)), ++ {'test': 3}) ++ ++ self.assertEqual(ttk._val_or_dict(tk, {'test': None}, 'x:y'), ++ 'test val') ++ ++ self.assertEqual(ttk._val_or_dict(tk, {'test': 3}, 'x:y'), ++ {'test': 3}) + + + def test_convert_stringval(self): +diff -r c0e311e010fc Lib/tkinter/test/test_ttk/test_widgets.py +--- a/Lib/tkinter/test/test_ttk/test_widgets.py ++++ b/Lib/tkinter/test/test_ttk/test_widgets.py +@@ -460,10 +460,7 @@ + + + def test_bbox(self): +- self.assertEqual(len(self.entry.bbox(0)), 4) +- for item in self.entry.bbox(0): +- self.assertIsInstance(item, int) +- ++ self.assertIsBoundingBox(self.entry.bbox(0)) + self.assertRaises(tkinter.TclError, self.entry.bbox, 'noindex') + self.assertRaises(tkinter.TclError, self.entry.bbox, None) + +@@ -1216,12 +1213,7 @@ + self.assertTrue(children) + + bbox = self.tv.bbox(children[0]) +- self.assertEqual(len(bbox), 4) +- self.assertIsInstance(bbox, tuple) +- for item in bbox: +- if not isinstance(item, int): +- self.fail("Invalid bounding box: %s" % bbox) +- break ++ self.assertIsBoundingBox(bbox) + + # compare width in bboxes + self.tv['columns'] = ['test'] +diff -r c0e311e010fc Lib/tkinter/test/widget_tests.py +--- a/Lib/tkinter/test/widget_tests.py ++++ b/Lib/tkinter/test/widget_tests.py +@@ -202,6 +202,16 @@ + def checkVariableParam(self, widget, name, var): + self.checkParam(widget, name, var, conv=str) + ++ def assertIsBoundingBox(self, bbox): ++ self.assertIsNotNone(bbox) ++ self.assertIsInstance(bbox, tuple) ++ if len(bbox) != 4: ++ self.fail('Invalid bounding box: %r' % (bbox,)) ++ for item in bbox: ++ if not isinstance(item, int): ++ self.fail('Invalid bounding box: %r' % (bbox,)) ++ break ++ + + class StandardOptionsTests: + STANDARD_OPTIONS = ( +diff -r c0e311e010fc Lib/tkinter/ttk.py +--- a/Lib/tkinter/ttk.py ++++ b/Lib/tkinter/ttk.py +@@ -272,9 +272,10 @@ + it = iter(nval) + return [_flatten(spec) for spec in zip(it, it)] + +-def _list_from_layouttuple(ltuple): ++def _list_from_layouttuple(tk, ltuple): + """Construct a list from the tuple returned by ttk::layout, this is + somewhat the reverse of _format_layoutlist.""" ++ ltuple = tk.splitlist(ltuple) + res = [] + + indx = 0 +@@ -293,17 +294,14 @@ + indx += 2 + + if opt == 'children': +- if (tkinter._default_root and +- not tkinter._default_root.wantobjects()): +- val = tkinter._default_root.splitlist(val) +- val = _list_from_layouttuple(val) ++ val = _list_from_layouttuple(tk, val) + + opts[opt] = val + + return res + +-def _val_or_dict(options, func, *args): +- """Format options then call func with 
args and options and return ++def _val_or_dict(tk, options, *args): ++ """Format options then call Tk command with args and options and return + the appropriate result. + + If no option is specified, a dict is returned. If a option is +@@ -311,14 +309,12 @@ + Otherwise, the function just sets the passed options and the caller + shouldn't be expecting a return value anyway.""" + options = _format_optdict(options) +- res = func(*(args + options)) ++ res = tk.call(*(args + options)) + + if len(options) % 2: # option specified without a value, return its value + return res + +- if tkinter._default_root: +- res = tkinter._default_root.splitlist(res) +- return _dict_from_tcltuple(res) ++ return _dict_from_tcltuple(tk.splitlist(res)) + + def _convert_stringval(value): + """Converts a value to, hopefully, a more appropriate Python object.""" +@@ -396,7 +392,7 @@ + a sequence identifying the value for that option.""" + if query_opt is not None: + kw[query_opt] = None +- return _val_or_dict(kw, self.tk.call, self._name, "configure", style) ++ return _val_or_dict(self.tk, kw, self._name, "configure", style) + + + def map(self, style, query_opt=None, **kw): +@@ -411,8 +407,8 @@ + return _list_from_statespec(self.tk.splitlist( + self.tk.call(self._name, "map", style, '-%s' % query_opt))) + +- return _dict_from_tcltuple( +- self.tk.call(self._name, "map", style, *(_format_mapdict(kw)))) ++ return _dict_from_tcltuple(self.tk.splitlist( ++ self.tk.call(self._name, "map", style, *(_format_mapdict(kw))))) + + + def lookup(self, style, option, state=None, default=None): +@@ -466,8 +462,8 @@ + lspec = "null" # could be any other word, but this may make sense + # when calling layout(style) later + +- return _list_from_layouttuple(self.tk.splitlist( +- self.tk.call(self._name, "layout", style, lspec))) ++ return _list_from_layouttuple(self.tk, ++ self.tk.call(self._name, "layout", style, lspec)) + + + def element_create(self, elementname, etype, *args, **kw): +@@ -907,7 +903,7 @@ + options to the corresponding values.""" + if option is not None: + kw[option] = None +- return _val_or_dict(kw, self.tk.call, self._w, "tab", tab_id) ++ return _val_or_dict(self.tk, kw, self._w, "tab", tab_id) + + + def tabs(self): +@@ -984,7 +980,7 @@ + Otherwise, sets the options to the corresponding values.""" + if option is not None: + kw[option] = None +- return _val_or_dict(kw, self.tk.call, self._w, "pane", pane) ++ return _val_or_dict(self.tk, kw, self._w, "pane", pane) + + + def sashpos(self, index, newpos=None): +@@ -1223,7 +1219,7 @@ + Otherwise, sets the options to the corresponding values.""" + if option is not None: + kw[option] = None +- return _val_or_dict(kw, self.tk.call, self._w, "column", column) ++ return _val_or_dict(self.tk, kw, self._w, "column", column) + + + def delete(self, *items): +@@ -1282,7 +1278,7 @@ + if option is not None: + kw[option] = None + +- return _val_or_dict(kw, self.tk.call, self._w, 'heading', column) ++ return _val_or_dict(self.tk, kw, self._w, 'heading', column) + + + def identify(self, component, x, y): +@@ -1361,7 +1357,7 @@ + values as given by kw.""" + if option is not None: + kw[option] = None +- return _val_or_dict(kw, self.tk.call, self._w, "item", item) ++ return _val_or_dict(self.tk, kw, self._w, "item", item) + + + def move(self, item, parent, index): +@@ -1456,7 +1452,7 @@ + values for the given tagname.""" + if option is not None: + kw[option] = None +- return _val_or_dict(kw, self.tk.call, self._w, "tag", "configure", ++ return _val_or_dict(self.tk, kw, self._w, "tag", 
"configure", + tagname) + + +diff -r c0e311e010fc Lib/turtle.py +--- a/Lib/turtle.py ++++ b/Lib/turtle.py +@@ -140,7 +140,7 @@ + _tg_utilities = ['write_docstringdict', 'done'] + + __all__ = (_tg_classes + _tg_screen_functions + _tg_turtle_functions + +- _tg_utilities) # + _math_functions) ++ _tg_utilities + ['Terminator']) # + _math_functions) + + _alias_list = ['addshape', 'backward', 'bk', 'fd', 'ht', 'lt', 'pd', 'pos', + 'pu', 'rt', 'seth', 'setpos', 'setposition', 'st', +@@ -2594,7 +2594,7 @@ + Example (for a Turtle instance named turtle): + >>> turtle.setundobuffer(42) + """ +- if size is None: ++ if size is None or size <= 0: + self.undobuffer = None + else: + self.undobuffer = Tbuffer(size) +@@ -2945,7 +2945,7 @@ + self._stretchfactor = a11, a22 + self._shearfactor = a12/a22 + self._tilt = alfa +- self._update() ++ self.pen(resizemode="user") + + + def _polytrafo(self, poly): +diff -r c0e311e010fc Lib/turtledemo/__main__.py +--- a/Lib/turtledemo/__main__.py ++++ b/Lib/turtledemo/__main__.py +@@ -27,88 +27,54 @@ + return [entry[:-3] for entry in os.listdir(demo_dir) if + entry.endswith(".py") and entry[0] != '_'] + +-def showDemoHelp(): +- view_file(demo.root, "Help on turtleDemo", +- os.path.join(demo_dir, "demohelp.txt")) +- +-def showAboutDemo(): +- view_file(demo.root, "About turtleDemo", +- os.path.join(demo_dir, "about_turtledemo.txt")) +- +-def showAboutTurtle(): +- view_file(demo.root, "About the new turtle module.", +- os.path.join(demo_dir, "about_turtle.txt")) ++help_entries = ( # (help_label, help_file) ++ ('Turtledemo help', "demohelp.txt"), ++ ('About turtledemo', "about_turtledemo.txt"), ++ ('About turtle module', "about_turtle.txt"), ++ ) + + class DemoWindow(object): + +- def __init__(self, filename=None): #, root=None): ++ def __init__(self, filename=None): + self.root = root = turtle._root = Tk() ++ root.title('Python turtle-graphics examples') + root.wm_protocol("WM_DELETE_WINDOW", self._destroy) + +- ################# ++ root.grid_rowconfigure(1, weight=1) ++ root.grid_columnconfigure(0, weight=1) ++ root.grid_columnconfigure(1, minsize=90, weight=1) ++ root.grid_columnconfigure(2, minsize=90, weight=1) ++ root.grid_columnconfigure(3, minsize=90, weight=1) ++ + self.mBar = Frame(root, relief=RAISED, borderwidth=2) +- self.mBar.pack(fill=X) +- + self.ExamplesBtn = self.makeLoadDemoMenu() + self.OptionsBtn = self.makeHelpMenu() +- self.mBar.tk_menuBar(self.ExamplesBtn, self.OptionsBtn) #, QuitBtn) ++ self.mBar.grid(row=0, columnspan=4, sticky='news') + +- root.title('Python turtle-graphics examples') +- ################# +- self.left_frame = left_frame = Frame(root) +- self.text_frame = text_frame = Frame(left_frame) +- self.vbar = vbar =Scrollbar(text_frame, name='vbar') +- self.text = text = Text(text_frame, +- name='text', padx=5, wrap='none', +- width=45) +- vbar['command'] = text.yview +- vbar.pack(side=LEFT, fill=Y) +- ##################### +- self.hbar = hbar =Scrollbar(text_frame, name='hbar', orient=HORIZONTAL) +- hbar['command'] = text.xview +- hbar.pack(side=BOTTOM, fill=X) +- ##################### +- text['yscrollcommand'] = vbar.set +- text.config(font=txtfont) +- text.config(xscrollcommand=hbar.set) +- text.pack(side=LEFT, fill=Y, expand=1) +- ##################### +- self.output_lbl = Label(left_frame, height= 1,text=" --- ", bg = "#ddf", +- font = ("Arial", 16, 'normal')) +- self.output_lbl.pack(side=BOTTOM, expand=0, fill=X) +- ##################### +- text_frame.pack(side=LEFT, fill=BOTH, expand=0) +- left_frame.pack(side=LEFT, fill=BOTH, 
expand=0) +- self.graph_frame = g_frame = Frame(root) ++ pane = PanedWindow(orient=HORIZONTAL, sashwidth=5, ++ sashrelief=SOLID, bg='#ddd') ++ pane.add(self.makeTextFrame(pane)) ++ pane.add(self.makeGraphFrame(pane)) ++ pane.grid(row=1, columnspan=4, sticky='news') + +- turtle._Screen._root = g_frame +- turtle._Screen._canvas = turtle.ScrolledCanvas(g_frame, 800, 600, 1000, 800) +- #xturtle.Screen._canvas.pack(expand=1, fill="both") +- self.screen = _s_ = turtle.Screen() +-##### +- turtle.TurtleScreen.__init__(_s_, _s_._canvas) +-##### +- self.scanvas = _s_._canvas +- #xturtle.RawTurtle.canvases = [self.scanvas] +- turtle.RawTurtle.screens = [_s_] ++ self.output_lbl = Label(root, height= 1, text=" --- ", bg="#ddf", ++ font=("Arial", 16, 'normal'), borderwidth=2, ++ relief=RIDGE) ++ self.start_btn = Button(root, text=" START ", font=btnfont, ++ fg="white", disabledforeground = "#fed", ++ command=self.startDemo) ++ self.stop_btn = Button(root, text=" STOP ", font=btnfont, ++ fg="white", disabledforeground = "#fed", ++ command=self.stopIt) ++ self.clear_btn = Button(root, text=" CLEAR ", font=btnfont, ++ fg="white", disabledforeground="#fed", ++ command = self.clearCanvas) ++ self.output_lbl.grid(row=2, column=0, sticky='news', padx=(0,5)) ++ self.start_btn.grid(row=2, column=1, sticky='ew') ++ self.stop_btn.grid(row=2, column=2, sticky='ew') ++ self.clear_btn.grid(row=2, column=3, sticky='ew') + +- self.scanvas.pack(side=TOP, fill=BOTH, expand=1) +- +- self.btn_frame = btn_frame = Frame(g_frame, height=100) +- self.start_btn = Button(btn_frame, text=" START ", font=btnfont, fg = "white", +- disabledforeground = "#fed", command=self.startDemo) +- self.start_btn.pack(side=LEFT, fill=X, expand=1) +- self.stop_btn = Button(btn_frame, text=" STOP ", font=btnfont, fg = "white", +- disabledforeground = "#fed", command = self.stopIt) +- self.stop_btn.pack(side=LEFT, fill=X, expand=1) +- self.clear_btn = Button(btn_frame, text=" CLEAR ", font=btnfont, fg = "white", +- disabledforeground = "#fed", command = self.clearCanvas) +- self.clear_btn.pack(side=LEFT, fill=X, expand=1) +- +- self.btn_frame.pack(side=TOP, fill=BOTH, expand=0) +- self.graph_frame.pack(side=TOP, fill=BOTH, expand=1) +- +- Percolator(text).insertfilter(ColorDelegator()) ++ Percolator(self.text).insertfilter(ColorDelegator()) + self.dirty = False + self.exitflag = False + if filename: +@@ -117,9 +83,46 @@ + "Choose example from menu", "black") + self.state = STARTUP + +- def _destroy(self): +- self.root.destroy() +- sys.exit() ++ ++ def onResize(self, event): ++ cwidth = self._canvas.winfo_width() ++ cheight = self._canvas.winfo_height() ++ self._canvas.xview_moveto(0.5*(self.canvwidth-cwidth)/self.canvwidth) ++ self._canvas.yview_moveto(0.5*(self.canvheight-cheight)/self.canvheight) ++ ++ def makeTextFrame(self, root): ++ self.text_frame = text_frame = Frame(root) ++ self.text = text = Text(text_frame, name='text', padx=5, ++ wrap='none', width=45) ++ ++ self.vbar = vbar = Scrollbar(text_frame, name='vbar') ++ vbar['command'] = text.yview ++ vbar.pack(side=LEFT, fill=Y) ++ self.hbar = hbar = Scrollbar(text_frame, name='hbar', orient=HORIZONTAL) ++ hbar['command'] = text.xview ++ hbar.pack(side=BOTTOM, fill=X) ++ ++ text['font'] = txtfont ++ text['yscrollcommand'] = vbar.set ++ text['xscrollcommand'] = hbar.set ++ text.pack(side=LEFT, fill=BOTH, expand=1) ++ return text_frame ++ ++ def makeGraphFrame(self, root): ++ turtle._Screen._root = root ++ self.canvwidth = 1000 ++ self.canvheight = 800 ++ turtle._Screen._canvas = self._canvas 
= canvas = turtle.ScrolledCanvas( ++ root, 800, 600, self.canvwidth, self.canvheight) ++ canvas.adjustScrolls() ++ canvas._rootwindow.bind('', self.onResize) ++ canvas._canvas['borderwidth'] = 0 ++ ++ self.screen = _s_ = turtle.Screen() ++ turtle.TurtleScreen.__init__(_s_, _s_._canvas) ++ self.scanvas = _s_._canvas ++ turtle.RawTurtle.screens = [_s_] ++ return canvas + + def configGUI(self, menu, start, stop, clear, txt="", color="blue"): + self.ExamplesBtn.config(state=menu) +@@ -145,9 +148,9 @@ + + self.output_lbl.config(text=txt, fg=color) + +- + def makeLoadDemoMenu(self): +- CmdBtn = Menubutton(self.mBar, text='Examples', underline=0, font=menufont) ++ CmdBtn = Menubutton(self.mBar, text='Examples', ++ underline=0, font=menufont) + CmdBtn.pack(side=LEFT, padx="2m") + CmdBtn.menu = Menu(CmdBtn) + +@@ -167,12 +170,10 @@ + CmdBtn.pack(side=LEFT, padx='2m') + CmdBtn.menu = Menu(CmdBtn) + +- CmdBtn.menu.add_command(label='About turtle.py', font=menufont, +- command=showAboutTurtle) +- CmdBtn.menu.add_command(label='turtleDemo - Help', font=menufont, +- command=showDemoHelp) +- CmdBtn.menu.add_command(label='About turtleDemo', font=menufont, +- command=showAboutDemo) ++ for help_label, help_file in help_entries: ++ def show(help_label=help_label, help_file=help_file): ++ view_file(self.root, help_label, os.path.join(demo_dir, help_file)) ++ CmdBtn.menu.add_command(label=help_label, font=menufont, command=show) + + CmdBtn['menu'] = CmdBtn.menu + return CmdBtn +@@ -180,7 +181,6 @@ + def refreshCanvas(self): + if not self.dirty: return + self.screen.clear() +- #self.screen.mode("standard") + self.dirty=False + + def loadfile(self, filename): +@@ -238,29 +238,16 @@ + self.configGUI(NORMAL, NORMAL, DISABLED, DISABLED, + "STOPPED!", "red") + turtle.TurtleScreen._RUNNING = False +- #print "stopIT: exitflag = True" + else: + turtle.TurtleScreen._RUNNING = False +- #print "stopIt: exitflag = False" ++ ++ def _destroy(self): ++ self.root.destroy() ++ ++ ++def main(): ++ demo = DemoWindow() ++ demo.root.mainloop() + + if __name__ == '__main__': +- demo = DemoWindow() +- RUN = True +- while RUN: +- try: +- #print("ENTERING mainloop") +- demo.root.mainloop() +- except AttributeError: +- #print("AttributeError!- WAIT A MOMENT!") +- time.sleep(0.3) +- print("GOING ON ..") +- demo.ckearCanvas() +- except TypeError: +- demo.screen._delete("all") +- #print("CRASH!!!- WAIT A MOMENT!") +- time.sleep(0.3) +- #print("GOING ON ..") +- demo.clearCanvas() +- except: +- print("BYE!") +- RUN = False ++ main() +diff -r c0e311e010fc Lib/turtledemo/clock.py +--- a/Lib/turtledemo/clock.py ++++ b/Lib/turtledemo/clock.py +@@ -13,8 +13,6 @@ + from turtle import * + from datetime import datetime + +-mode("logo") +- + def jump(distanz, winkel=0): + penup() + right(winkel) +@@ -42,7 +40,6 @@ + hand_form = get_poly() + register_shape(name, hand_form) + +- + def clockface(radius): + reset() + pensize(7) +@@ -83,7 +80,6 @@ + writer.pu() + writer.bk(85) + +- + def wochentag(t): + wochentag = ["Monday", "Tuesday", "Wednesday", + "Thursday", "Friday", "Saturday", "Sunday"] +@@ -102,22 +98,25 @@ + sekunde = t.second + t.microsecond*0.000001 + minute = t.minute + sekunde/60.0 + stunde = t.hour + minute/60.0 +- tracer(False) +- writer.clear() +- writer.home() +- writer.forward(65) +- writer.write(wochentag(t), +- align="center", font=("Courier", 14, "bold")) +- writer.back(150) +- writer.write(datum(t), +- align="center", font=("Courier", 14, "bold")) +- writer.forward(85) +- tracer(True) +- second_hand.setheading(6*sekunde) +- 
minute_hand.setheading(6*minute) +- hour_hand.setheading(30*stunde) +- tracer(True) +- ontimer(tick, 100) ++ try: ++ tracer(False) # Terminator can occur here ++ writer.clear() ++ writer.home() ++ writer.forward(65) ++ writer.write(wochentag(t), ++ align="center", font=("Courier", 14, "bold")) ++ writer.back(150) ++ writer.write(datum(t), ++ align="center", font=("Courier", 14, "bold")) ++ writer.forward(85) ++ tracer(True) ++ second_hand.setheading(6*sekunde) # or here ++ minute_hand.setheading(6*minute) ++ hour_hand.setheading(30*stunde) ++ tracer(True) ++ ontimer(tick, 100) ++ except Terminator: ++ pass # turtledemo user pressed STOP + + def main(): + tracer(False) +@@ -127,6 +126,7 @@ + return "EVENTLOOP" + + if __name__ == "__main__": ++ mode("logo") + msg = main() + print(msg) + mainloop() +diff -r c0e311e010fc Lib/turtledemo/colormixer.py +--- a/Lib/turtledemo/colormixer.py ++++ b/Lib/turtledemo/colormixer.py +@@ -1,8 +1,6 @@ + # colormixer + + from turtle import Screen, Turtle, mainloop +-import sys +-sys.setrecursionlimit(20000) # overcomes, for now, an instability of Python 3.0 + + class ColorTurtle(Turtle): + +diff -r c0e311e010fc Lib/turtledemo/demohelp.txt +--- a/Lib/turtledemo/demohelp.txt ++++ b/Lib/turtledemo/demohelp.txt +@@ -53,18 +53,31 @@ + + (2) How to add your own demos to the demo repository + +- - place: same directory as turtledemo/__main__.py ++ - Place the file in the same directory as turtledemo/__main__.py ++ IMPORTANT! When imported, the demo should not modify the system ++ by calling functions in other modules, such as sys, tkinter, or ++ turtle. Global variables should be initialized in main(). + +- - requirements on source code: +- code must contain a main() function which will +- be executed by the viewer (see provided example scripts) +- main() may return a string which will be displayed +- in the Label below the source code window (when execution +- has finished.) ++ - The code must contain a main() function which will ++ be executed by the viewer (see provided example scripts). ++ It may return a string which will be displayed in the Label below ++ the source code window (when execution has finished.) + +- !! For programs, which are EVENT DRIVEN, main must return +- !! the string "EVENTLOOP". This informs the viewer, that the +- !! script is still running and must be stopped by the user! ++ - In order to run mydemo.py by itself, such as during development, ++ add the following at the end of the file: + +- +- ++ if __name__ == '__main__': ++ main() ++ mainloop() # keep window open ++ ++ python -m turtledemo.mydemo # will then run it ++ ++ - If the demo is EVENT DRIVEN, main must return the string ++ "EVENTLOOP". This informs the demo viewer that the script is ++ still running and must be stopped by the user! ++ ++ If an "EVENTLOOP" demo runs by itself, as with clock, which uses ++ ontimer, or minimal_hanoi, which loops by recursion, then the ++ code should catch the turtle.Terminator exception that will be ++ raised when the user presses the STOP button. (Paint is not such ++ a demo; it only acts in response to mouse clicks and movements.) +diff -r c0e311e010fc Lib/turtledemo/forest.py +--- a/Lib/turtledemo/forest.py ++++ b/Lib/turtledemo/forest.py +@@ -3,12 +3,12 @@ + + tdemo_forest.py + +-Displays a 'forest' of 3 'breadth-first-trees' +-similar to the one from example tree. +-For further remarks see xtx_tree.py ++Displays a 'forest' of 3 breadth-first-trees ++similar to the one in tree. 
++For further remarks see tree.py + + This example is a 'breadth-first'-rewrite of +-a Logo program written by Erich Neuwirth. See: ++a Logo program written by Erich Neuwirth. See + http://homepage.univie.ac.at/erich.neuwirth/ + """ + from turtle import Turtle, colormode, tracer, mainloop +@@ -104,6 +104,5 @@ + return "runtime: %.2f sec." % (b-a) + + if __name__ == '__main__': +- msg = main() +- print(msg) ++ main() + mainloop() +diff -r c0e311e010fc Lib/turtledemo/minimal_hanoi.py +--- a/Lib/turtledemo/minimal_hanoi.py ++++ b/Lib/turtledemo/minimal_hanoi.py +@@ -50,9 +50,12 @@ + def play(): + onkey(None,"space") + clear() +- hanoi(6, t1, t2, t3) +- write("press STOP button to exit", +- align="center", font=("Courier", 16, "bold")) ++ try: ++ hanoi(6, t1, t2, t3) ++ write("press STOP button to exit", ++ align="center", font=("Courier", 16, "bold")) ++ except Terminator: ++ pass # turtledemo user pressed STOP + + def main(): + global t1, t2, t3 +diff -r c0e311e010fc Lib/turtledemo/nim.py +--- a/Lib/turtledemo/nim.py ++++ b/Lib/turtledemo/nim.py +@@ -143,7 +143,6 @@ + self.writer.write(msg1, align="center", font=("Courier",14,"bold")) + self.screen.tracer(True) + +- + def setup(self): + self.screen.tracer(False) + for row in range(3): +@@ -181,6 +180,7 @@ + if self.game.state == Nim.OVER: + self.screen.clear() + ++ + class NimController(object): + + def __init__(self, game): +@@ -201,6 +201,7 @@ + self.game.model.notify_move(row, col) + self.BUSY = False + ++ + class Nim(object): + CREATED = 0 + RUNNING = 1 +@@ -213,11 +214,10 @@ + self.controller = NimController(self) + + +-mainscreen = turtle.Screen() +-mainscreen.mode("standard") +-mainscreen.setup(SCREENWIDTH, SCREENHEIGHT) +- + def main(): ++ mainscreen = turtle.Screen() ++ mainscreen.mode("standard") ++ mainscreen.setup(SCREENWIDTH, SCREENHEIGHT) + nim = Nim(mainscreen) + return "EVENTLOOP!" + +diff -r c0e311e010fc Lib/turtledemo/paint.py +--- a/Lib/turtledemo/paint.py ++++ b/Lib/turtledemo/paint.py +@@ -3,11 +3,15 @@ + + tdemo_paint.py + +-A simple eventdriven paint program ++A simple event-driven paint program + +-- use left mouse button to move turtle +-- middle mouse button to change color +-- right mouse button do turn filling on/off ++- left mouse button moves turtle ++- middle mouse button changes color ++- right mouse button toogles betweem pen up ++(no line drawn when the turtle moves) and ++pen down (line is drawn). If pen up follows ++at least two pen-down moves, the polygon that ++includes the starting point is filled. + ------------------------------------------- + Play around by clicking into the canvas + using all three mouse buttons. +diff -r c0e311e010fc Lib/turtledemo/peace.py +--- a/Lib/turtledemo/peace.py ++++ b/Lib/turtledemo/peace.py +@@ -3,14 +3,10 @@ + + tdemo_peace.py + +-A very simple drawing suitable as a beginner's +-programming example. +- +-Uses only commands, which are also available in +-old turtle.py. +- +-Intentionally no variables are used except for the +-colorloop: ++A simple drawing suitable as a beginner's ++programming example. Aside from the ++peacecolors assignment and the for loop, ++it only uses turtle commands. + """ + + from turtle import * +@@ -21,7 +17,7 @@ + "royalblue1", "dodgerblue4") + + reset() +- s = Screen() ++ Screen() + up() + goto(-320,-195) + width(70) +@@ -58,7 +54,7 @@ + up() + + goto(0,300) # vanish if hideturtle() is not available ;-) +- return "Done!!" ++ return "Done!" 
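The clock and minimal_hanoi changes above wrap their drawing in try/except turtle.Terminator, because the turtledemo viewer raises that exception when STOP is pressed while a demo is still running. A minimal sketch of the same pattern for an ontimer-driven demo (assumes a turtle screen is already up; not part of the patch):

    import turtle

    def tick():
        try:
            turtle.forward(1)
            turtle.left(1)
            turtle.ontimer(tick, 100)   # keep rescheduling ourselves
        except turtle.Terminator:
            pass   # the demo viewer pressed STOP; just stop scheduling
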
+ + if __name__ == "__main__": + main() +diff -r c0e311e010fc Lib/turtledemo/planet_and_moon.py +--- a/Lib/turtledemo/planet_and_moon.py ++++ b/Lib/turtledemo/planet_and_moon.py +@@ -12,9 +12,9 @@ + Planet has a circular orbit, moon a stable + orbit around the planet. + +-You can hold the movement temporarily by pressing +-the left mouse button with mouse over the +-scrollbar of the canvas. ++You can hold the movement temporarily by ++pressing the left mouse button with the ++mouse over the scrollbar of the canvas. + + """ + from turtle import Shape, Turtle, mainloop, Vec2D as Vec +@@ -108,6 +108,5 @@ + return "Done!" + + if __name__ == '__main__': +- msg = main() +- print(msg) +- #mainloop() ++ main() ++ mainloop() +diff -r c0e311e010fc Lib/turtledemo/tree.py +--- a/Lib/turtledemo/tree.py ++++ b/Lib/turtledemo/tree.py +@@ -11,9 +11,9 @@ + (1) a tree-generator, where the drawing is + quasi the side-effect, whereas the generator + always yields None. +-(2) Turtle-cloning: At each branching point the +-current pen is cloned. So in the end there +-are 1024 turtles. ++(2) Turtle-cloning: At each branching point ++the current pen is cloned. So in the end ++there are 1024 turtles. + """ + from turtle import Turtle, mainloop + from time import clock +diff -r c0e311e010fc Lib/turtledemo/two_canvases.py +--- a/Lib/turtledemo/two_canvases.py ++++ b/Lib/turtledemo/two_canvases.py +@@ -1,52 +1,54 @@ +-#!/usr/bin/env python3 +-## DEMONSTRATES USE OF 2 CANVASES, SO CANNOT BE RUN IN DEMOVIEWER! +-"""turtle example: Using TurtleScreen and RawTurtle +-for drawing on two distinct canvases. ++"""turtledemo.two_canvases ++ ++Use TurtleScreen and RawTurtle to draw on two ++distinct canvases in a separate windows. The ++new window must be separately closed in ++addition to pressing the STOP button. + """ ++ + from turtle import TurtleScreen, RawTurtle, TK + +-root = TK.Tk() +-cv1 = TK.Canvas(root, width=300, height=200, bg="#ddffff") +-cv2 = TK.Canvas(root, width=300, height=200, bg="#ffeeee") +-cv1.pack() +-cv2.pack() ++def main(): ++ root = TK.Tk() ++ cv1 = TK.Canvas(root, width=300, height=200, bg="#ddffff") ++ cv2 = TK.Canvas(root, width=300, height=200, bg="#ffeeee") ++ cv1.pack() ++ cv2.pack() + +-s1 = TurtleScreen(cv1) +-s1.bgcolor(0.85, 0.85, 1) +-s2 = TurtleScreen(cv2) +-s2.bgcolor(1, 0.85, 0.85) ++ s1 = TurtleScreen(cv1) ++ s1.bgcolor(0.85, 0.85, 1) ++ s2 = TurtleScreen(cv2) ++ s2.bgcolor(1, 0.85, 0.85) + +-p = RawTurtle(s1) +-q = RawTurtle(s2) ++ p = RawTurtle(s1) ++ q = RawTurtle(s2) + +-p.color("red", (1, 0.85, 0.85)) +-p.width(3) +-q.color("blue", (0.85, 0.85, 1)) +-q.width(3) ++ p.color("red", (1, 0.85, 0.85)) ++ p.width(3) ++ q.color("blue", (0.85, 0.85, 1)) ++ q.width(3) + +-for t in p,q: +- t.shape("turtle") +- t.lt(36) ++ for t in p,q: ++ t.shape("turtle") ++ t.lt(36) + +-q.lt(180) ++ q.lt(180) + +-for t in p, q: +- t.begin_fill() +-for i in range(5): + for t in p, q: +- t.fd(50) +- t.lt(72) +-for t in p,q: +- t.end_fill() +- t.lt(54) +- t.pu() +- t.bk(50) ++ t.begin_fill() ++ for i in range(5): ++ for t in p, q: ++ t.fd(50) ++ t.lt(72) ++ for t in p,q: ++ t.end_fill() ++ t.lt(54) ++ t.pu() ++ t.bk(50) + +-## Want to get some info? 
++ return "EVENTLOOP" + +-#print(s1, s2) +-#print(p, q) +-#print(s1.turtles()) +-#print(s2.turtles()) + +-TK.mainloop() ++if __name__ == '__main__': ++ main() ++ TK.mainloop() # keep window open until user closes it +diff -r c0e311e010fc Lib/urllib/request.py +--- a/Lib/urllib/request.py ++++ b/Lib/urllib/request.py +@@ -1315,7 +1315,7 @@ + url = req.selector + if url[:2] == '//' and url[2:3] != '/' and (req.host and + req.host != 'localhost'): +- if not req.host is self.get_names(): ++ if not req.host in self.get_names(): + raise URLError("file:// scheme is supported only on localhost") + else: + return self.open_local_file(req) +@@ -1911,7 +1911,7 @@ + # XXX thread unsafe! + if len(self.ftpcache) > MAXFTPCACHE: + # Prune the cache, rather arbitrarily +- for k in self.ftpcache.keys(): ++ for k in list(self.ftpcache): + if k != key: + v = self.ftpcache[k] + del self.ftpcache[k] +diff -r c0e311e010fc Lib/venv/__init__.py +--- a/Lib/venv/__init__.py ++++ b/Lib/venv/__init__.py +@@ -30,7 +30,6 @@ + import logging + import os + import shutil +-import struct + import subprocess + import sys + import types +@@ -140,11 +139,12 @@ + create_if_needed(path) + create_if_needed(libpath) + # Issue 21197: create lib64 as a symlink to lib on 64-bit non-OS X POSIX +- if ((struct.calcsize('P') == 8) and (os.name == 'posix') and ++ if ((sys.maxsize > 2**32) and (os.name == 'posix') and + (sys.platform != 'darwin')): + p = os.path.join(env_dir, 'lib') + link_path = os.path.join(env_dir, 'lib64') +- os.symlink(p, link_path) ++ if not os.path.exists(link_path): # Issue #21643 ++ os.symlink(p, link_path) + context.bin_path = binpath = os.path.join(env_dir, binname) + context.bin_name = binname + context.env_exe = os.path.join(binpath, exename) +@@ -212,7 +212,11 @@ + for suffix in ('python', 'python3'): + path = os.path.join(binpath, suffix) + if not os.path.exists(path): +- os.symlink(exename, path) ++ # Issue 18807: make copies if ++ # symlinks are not wanted ++ copier(context.env_exe, path) ++ if not os.path.islink(path): ++ os.chmod(path, 0o755) + else: + subdir = 'DLLs' + include = self.include_binary +@@ -234,7 +238,8 @@ + if 'init.tcl' in files: + tcldir = os.path.basename(root) + tcldir = os.path.join(context.env_dir, 'Lib', tcldir) +- os.makedirs(tcldir) ++ if not os.path.exists(tcldir): ++ os.makedirs(tcldir) + src = os.path.join(root, 'init.tcl') + dst = os.path.join(tcldir, 'init.tcl') + shutil.copyfile(src, dst) +diff -r c0e311e010fc Lib/xml/dom/minidom.py +--- a/Lib/xml/dom/minidom.py ++++ b/Lib/xml/dom/minidom.py +@@ -976,7 +976,7 @@ + def _get_nodeValue(self): + return self.data + def _set_nodeValue(self, value): +- self.data = data ++ self.data = value + nodeValue = property(_get_nodeValue, _set_nodeValue) + + # nodeName is an alias for target +diff -r c0e311e010fc Lib/zipfile.py +--- a/Lib/zipfile.py ++++ b/Lib/zipfile.py +@@ -411,7 +411,7 @@ + # Try to decode the extra field. 
+ extra = self.extra + unpack = struct.unpack +- while extra: ++ while len(extra) >= 4: + tp, ln = unpack('= 24: +diff -r c0e311e010fc Mac/BuildScript/build-installer.py +--- a/Mac/BuildScript/build-installer.py ++++ b/Mac/BuildScript/build-installer.py +@@ -150,17 +150,19 @@ + # $MACOSX_DEPLOYMENT_TARGET -> minimum OS X level + DEPTARGET = '10.3' + +-target_cc_map = { ++def getDeptargetTuple(): ++ return tuple([int(n) for n in DEPTARGET.split('.')[0:2]]) ++ ++def getTargetCompilers(): ++ target_cc_map = { + '10.3': ('gcc-4.0', 'g++-4.0'), + '10.4': ('gcc-4.0', 'g++-4.0'), + '10.5': ('gcc-4.2', 'g++-4.2'), + '10.6': ('gcc-4.2', 'g++-4.2'), +- '10.7': ('clang', 'clang++'), +- '10.8': ('clang', 'clang++'), +- '10.9': ('clang', 'clang++'), +-} ++ } ++ return target_cc_map.get(DEPTARGET, ('clang', 'clang++') ) + +-CC, CXX = target_cc_map[DEPTARGET] ++CC, CXX = getTargetCompilers() + + PYTHON_3 = getVersionTuple() >= (3, 0) + +@@ -193,10 +195,10 @@ + def library_recipes(): + result = [] + +- LT_10_5 = bool(DEPTARGET < '10.5') ++ LT_10_5 = bool(getDeptargetTuple() < (10, 5)) + + # Disable for now +- if False: # if (DEPTARGET > '10.5') and (getVersionTuple() >= (3, 5)): ++ if False: # if (getDeptargetTuple() > (10, 5)) and (getVersionTuple() >= (3, 5)): + result.extend([ + dict( + name="Tcl 8.5.15", +@@ -304,7 +306,7 @@ + ), + ]) + +- if DEPTARGET < '10.5': ++ if getDeptargetTuple() < (10, 5): + result.extend([ + dict( + name="Bzip2 1.0.6", +@@ -458,7 +460,7 @@ + ) + ) + +- if DEPTARGET < '10.4' and not PYTHON_3: ++ if getDeptargetTuple() < (10, 4) and not PYTHON_3: + result.append( + dict( + name="PythonSystemFixes", +@@ -679,7 +681,7 @@ + SDKPATH=os.path.abspath(SDKPATH) + DEPSRC=os.path.abspath(DEPSRC) + +- CC, CXX=target_cc_map[DEPTARGET] ++ CC, CXX = getTargetCompilers() + + print("Settings:") + print(" * Source directory:", SRCDIR) +@@ -985,6 +987,9 @@ + shellQuote(WORKDIR)[1:-1], + shellQuote(WORKDIR)[1:-1])) + ++ print("Running make touch") ++ runCommand("make touch") ++ + print("Running make") + runCommand("make") + +diff -r c0e311e010fc Misc/ACKS +--- a/Misc/ACKS ++++ b/Misc/ACKS +@@ -29,9 +29,11 @@ + Jyrki Alakuijala + Steve Alexander + Fred Allen ++Jeff Allen + Ray Allen + Billy G. Allie + Kevin Altis ++Skyler Leigh Amador + Joe Amenta + A. Amoroso + Mark Anacker +@@ -172,6 +174,7 @@ + Tobias Brink + Richard Brodie + Michael Broghton ++Ammar Brohi + Daniel Brotsky + Jean Brouwers + Gary S. Brown +@@ -195,7 +198,9 @@ + Alastair Burt + Tarn Weisner Burton + Lee Busby ++Katherine Busch + Ralph Butler ++Zach Byrne + Nicolas Cadou + Jp Calderone + Arnaud Calmettes +@@ -232,8 +237,10 @@ + Albert Chin-A-Young + Adal Chiriliuc + Matt Chisholm ++Lita Cho + Anders Chrigström + Tom Christiansen ++Renee Chu + Vadim Chugunov + Mauro Cicognini + David Cinege +@@ -305,6 +312,7 @@ + Arnaud Delobelle + Konrad Delong + Erik Demaine ++Martin Dengler + John Dennis + L. Peter Deutsch + Roger Dev +@@ -402,6 +410,7 @@ + Dan Finnie + Nils Fischbeck + Frederik Fix ++Tom Flanagan + Matt Fleming + Hernán Martínez Foffani + Artem Fokin +@@ -409,6 +418,7 @@ + Michael Foord + Amaury Forgeot d'Arc + Doug Fort ++Chris Foster + John Fouhy + Andrew Francis + Stefan Franke +@@ -507,6 +517,7 @@ + Lynda Hardman + Derek Harland + Jason Harper ++David Harrigan + Brian Harring + Jonathan Hartley + Travis B. 
Hartwell +@@ -652,6 +663,7 @@ + Tamito Kajiyama + Jan Kaliszewski + Peter van Kampen ++Jan Kanis + Rafe Kaplan + Jacob Kaplan-Moss + Janne Karila +@@ -746,6 +758,7 @@ + Chris Lawrence + Brian Leair + Mathieu Leduc-Hamel ++Amandine Lee + Antony Lee + Christopher Lee + Inyeol Lee +@@ -817,6 +830,7 @@ + Nick Maclaren + Don MacMillen + Tomasz Maćkowiak ++Wolfgang Maier + Steve Majewski + Marek Majkowski + Grzegorz Makarewicz +@@ -969,6 +983,7 @@ + Tomas Oppelstrup + Jason Orendorff + Douglas Orr ++William Orr + Michele Orrù + Oleg Oshmyan + Denis S. Otkidach +diff -r c0e311e010fc Misc/NEWS +--- a/Misc/NEWS ++++ b/Misc/NEWS +@@ -2,6 +2,279 @@ + Python News + +++++++++++ + ++What's New in Python 3.4.2? ++=========================== ++ ++Release date: XXXX-XX-XX ++ ++Core and Builtins ++----------------- ++ ++- Issue #21669: With the aid of heuristics in SyntaxError.__init__, the ++ parser now attempts to generate more meaningful (or at least more search ++ engine friendly) error messages when "exec" and "print" are used as ++ statements. ++ ++- Issue #21642: If the conditional if-else expression, allow an integer written ++ with no space between itself and the ``else`` keyword (e.g. ``True if 42else ++ False``) to be valid syntax. ++ ++- Issue #21523: Fix over-pessimistic computation of the stack effect of ++ some opcodes in the compiler. This also fixes a quadratic compilation ++ time issue noticeable when compiling code with a large number of "and" ++ and "or" operators. ++ ++Library ++------- ++ ++- Issue #16133: The asynchat.async_chat.handle_read() method now ignores ++ BlockingIOError exceptions. ++ ++- Issue #22044: Fixed premature DECREF in call_tzinfo_method. ++ Patch by Tom Flanagan. ++ ++- Issue #19884: readline: Disable the meta modifier key if stdout is not ++ a terminal to not write the ANSI sequence "\033[1034h" into stdout. This ++ sequence is used on some terminal (ex: TERM=xterm-256color") to enable ++ support of 8 bit characters. ++ ++- Issue #21888: plistlib's load() and loads() now work if the fmt parameter is ++ specified. ++ ++- Issue #21044: tarfile.open() now handles fileobj with an integer 'name' ++ attribute. Based on patch by Antoine Pietri. ++ ++- Issue #21867: Prevent turtle crash due to invalid undo buffer size. ++ ++- Issue #19076: Don't pass the redundant 'file' argument to self.error(). ++ ++- Issue #21942: Fixed source file viewing in pydoc's server mode on Windows. ++ ++- Issue #11259: asynchat.async_chat().set_terminator() now raises a ValueError ++ if the number of received bytes is negative. ++ ++- Issue #12523: asynchat.async_chat.push() now raises a TypeError if it doesn't ++ get a bytes string ++ ++- Issue #21707: Add missing kwonlyargcount argument to ++ ModuleFinder.replace_paths_in_code(). ++ ++- Issue #20639: calling Path.with_suffix('') allows removing the suffix ++ again. Patch by July Tikhonov. ++ ++- Issue #21714: Disallow the construction of invalid paths using ++ Path.with_name(). Original patch by Antony Lee. ++ ++- Issue #21897: Fix a crash with the f_locals attribute with closure ++ variables when frame.clear() has been called. ++ ++- Issue #21151: Fixed a segfault in the winreg module when ``None`` is passed ++ as a ``REG_BINARY`` value to SetValueEx. Patch by John Ehresman. ++ ++- Issue #21090: io.FileIO.readall() does not ignore I/O errors anymore. Before, ++ it ignored I/O errors if at least the first C call read() succeed. ++ ++- Issue #21781: ssl.RAND_add() now supports strings longer than 2 GB. 
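Some of the NEWS entries above are easiest to read with a concrete call in mind; for the pathlib change in issue #20639, the behaviour is simply that an empty suffix strips the existing one again. A quick sketch (not part of the patch):

    from pathlib import PurePosixPath

    p = PurePosixPath('archive.tar.gz')
    print(p.with_suffix('.bz2'))   # archive.tar.bz2
    print(p.with_suffix(''))       # archive.tar  (suffix removed again)
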
++ ++- Issue #11453: asyncore: emit a ResourceWarning when an unclosed file_wrapper ++ object is destroyed. The destructor now closes the file if needed. The ++ close() method can now be called twice: the second call does nothing. ++ ++- Issue #21858: Better handling of Python exceptions in the sqlite3 module. ++ ++- Issue #21476: Make sure the email.parser.BytesParser TextIOWrapper is ++ discarded after parsing, so the input file isn't unexpectedly closed. ++ ++- Issue #21729: Used the "with" statement in the dbm.dumb module to ensure ++ files closing. Patch by Claudiu Popa. ++ ++- Issue #21491: socketserver: Fix a race condition in child processes reaping. ++ ++- Issue #21832: Require named tuple inputs to be exact strings. ++ ++- Issue #19145: The times argument for itertools.repeat now handles ++ negative values the same way for keyword arguments as it does for ++ positional arguments. ++ ++- Issue #21812: turtle.shapetransform did not tranform the turtle on the ++ first call. (Issue identified and fixed by Lita Cho.) ++ ++- Issue #21635: The difflib SequenceMatcher.get_matching_blocks() method ++ cache didn't match the actual result. The former was a list of tuples ++ and the latter was a list of named tuples. ++ ++- Issue #21722: The distutils "upload" command now exits with a non-zero ++ return code when uploading fails. Patch by Martin Dengler. ++ ++- Issue #21723: asyncio.Queue: support any type of number (ex: float) for the ++ maximum size. Patch written by Vajrasky Kok. ++ ++- Issue #21326: Add a new is_closed() method to asyncio.BaseEventLoop. ++ run_forever() and run_until_complete() methods of asyncio.BaseEventLoop now ++ raise an exception if the event loop was closed. ++ ++- Issue #21774: Fixed NameError for an incorrect variable reference in the ++ XML Minidom code for creating processing instructions. ++ (Found and fixed by Claudiu Popa.) ++ ++- Issue #21766: Prevent a security hole in CGIHTTPServer by URL unquoting paths ++ before checking for a CGI script at that path. ++ ++- Issue #21310: Fixed possible resource leak in failed open(). ++ ++- Issue #21677: Fixed chaining nonnormalized exceptions in io close() methods. ++ ++- Issue #11709: Fix the pydoc.help function to not fail when sys.stdin is not a ++ valid file. ++ ++- Issue #13223: Fix pydoc.writedoc so that the HTML documentation for methods ++ that use 'self' in the example code is generated correctly. ++ ++- Issue #21463: In urllib.request, fix pruning of the FTP cache. ++ ++- Issue #21618: The subprocess module could fail to close open fds that were ++ inherited by the calling process and already higher than POSIX resource ++ limits would otherwise allow. On systems with a functioning /proc/self/fd ++ or /dev/fd interface the max is now ignored and all fds are closed. ++ ++- Issue #21552: Fixed possible integer overflow of too long string lengths in ++ the tkinter module on 64-bit platforms. ++ ++- Issue #14315: The zipfile module now ignores extra fields in the central ++ directory that are too short to be parsed instead of letting a struct.unpack ++ error bubble up as this "bad data" appears in many real world zip files in ++ the wild and is ignored by other zip tools. ++ ++- Issue #21402: tkinter.ttk now works when default root window is not set. ++ ++- Issue #10203: sqlite3.Row now truly supports sequence protocol. In particulr ++ it supports reverse() and negative indices. Original patch by Claudiu Popa. 
++ ++- Issue #18807: If copying (no symlinks) specified for a venv, then the python ++ interpreter aliases (python, python3) are now created by copying rather than ++ symlinking. ++ ++- Issue #14710: pkgutil.get_loader() no longer raises an exception when None is ++ found in sys.modules. ++ ++- Issue #14710: pkgutil.find_loader() no longer raises an exception when a ++ module doesn't exist. ++ ++- Issue #21481: Argparse equality and inequality tests now return ++ NotImplemented when comparing to an unknown type. ++ ++- Issue #8743: Fix interoperability between set objects and the ++ collections.Set() abstract base class. ++ ++- Issue #13355: random.triangular() no longer fails with a ZeroDivisionError ++ when low equals high. ++ ++- Issue #21538: The plistlib module now supports loading of binary plist files ++ when reference or offset size is not a power of two. ++ ++- Issue #21801: Validate that __signature__ is None or an instance of Signature. ++ ++- Issue #21923: Prevent AttributeError in distutils.sysconfig.customize_compiler ++ due to possible uninitialized _config_vars. ++ ++- Issue #21323: Fix http.server to again handle scripts in CGI subdirectories, ++ broken by the fix for security issue #19435. Patch by Zach Byrne. ++ ++Build ++----- ++ ++- Issue #21958: Define HAVE_ROUND when building with Visual Studio 2013 and ++ above. Patch by Zachary Turner. ++ ++- Issue #15759: "make suspicious", "make linkcheck" and "make doctest" in Doc/ ++ now display special message when and only when there are failures. ++ ++- Issue #17095: Fix Modules/Setup *shared* support. ++ ++- Issue #21811: Anticipated fixes to support OS X versions > 10.9. ++ ++IDLE ++---- ++ ++- Issue #21765: Add support for non-ascii identifiers to HyperParser. ++ ++- Issue #21940: Add unittest for WidgetRedirector. Initial patch by Saimadhav ++ Heblikar. ++ ++- Issue #18592: Add unittest for SearchDialogBase. Patch by Phil Webster. ++ ++- Issue #21694: Add unittest for ParenMatch. Patch by Saimadhav Heblikar. ++ ++- Issue #21686: add unittest for HyperParser. Original patch by Saimadhav ++ Heblikar. ++ ++- Issue #12387: Add missing upper(lower)case versions of default Windows key ++ bindings for Idle so Caps Lock does not disable them. Patch by Roger Serwy. ++ ++- Issue #21695: Closing a Find-in-files output window while the search is ++ still in progress no longer closes Idle. ++ ++- Issue #18910: Add unittest for textView. Patch by Phil Webster. ++ ++- Issue #18292: Add unittest for AutoExpand. Patch by Saihadhav Heblikar. ++ ++- Issue #18409: Add unittest for AutoComplete. Patch by Phil Webster. ++ ++Tests ++----- ++ ++- Issue #22002: Added ``load_package_tests`` function to test.support and used ++ it to implement/augment test discovery in test_asyncio, test_email, ++ test_importlib, test_json, and test_tools. ++ ++- Issue #21976: Fix test_ssl to accept LibreSSL version strings. Thanks ++ to William Orr. ++ ++- Issue #21918: Converted test_tools from a module to a package containing ++ separate test files for each tested script. ++ ++- Issue #20155: Changed HTTP method names in failing tests in test_httpservers ++ so that packet filtering software (specifically Windows Base Filtering Engine) ++ does not interfere with the transaction semantics expected by the tests. ++ ++- Issue #19493: Refactored the ctypes test package to skip tests explicitly ++ rather than silently. ++ ++- Issue #18492: All resources are now allowed when tests are not run by ++ regrtest.py. 
++ ++- Issue #21634: Fix pystone micro-benchmark: use floor division instead of true ++ division to benchmark integers instead of floating point numbers. Set pystone ++ version to 1.2. Patch written by Lennart Regebro. ++ ++- Issue #21605: Added tests for Tkinter images. ++ ++- Issue #21493: Added test for ntpath.expanduser(). Original patch by ++ Claudiu Popa. ++ ++- Issue #19925: Added tests for the spwd module. Original patch by Vajrasky Kok. ++ ++- Issue #21522: Added Tkinter tests for Listbox.itemconfigure(), ++ PanedWindow.paneconfigure(), and Menu.entryconfigure(). ++ ++Windows ++------- ++ ++- Issue #21671, CVE-2014-0224: The bundled version of OpenSSL has been ++ updated to 1.0.1h. ++ ++- Issue #10747: Use versioned labels in the Windows start menu. ++ Patch by Olive Kilburn. ++ ++Tools/Demos ++----------- ++ ++- Issue #21906: Make Tools/scripts/md5sum.py work in Python 3. ++ Patch by Zachary Ware. ++ ++- Issue #21629: Fix Argument Clinic's "--converters" feature. ++ ++ + What's New in Python 3.4.1? + =========================== + +diff -r c0e311e010fc Modules/_datetimemodule.c +--- a/Modules/_datetimemodule.c ++++ b/Modules/_datetimemodule.c +@@ -897,11 +897,11 @@ + } + } + else { +- Py_DECREF(offset); + PyErr_Format(PyExc_TypeError, + "tzinfo.%s() must return None or " + "timedelta, not '%.200s'", + name, Py_TYPE(offset)->tp_name); ++ Py_DECREF(offset); + return NULL; + } + +@@ -2153,7 +2153,7 @@ + * is odd. Note that x is odd when it's last bit is 1. The + * code below uses bitwise and operation to check the last + * bit. */ +- temp = PyNumber_And(x, one); /* temp <- x & 1 */ ++ temp = PyNumber_And(x, one); /* temp <- x & 1 */ + if (temp == NULL) { + Py_DECREF(x); + goto Done; +@@ -3224,10 +3224,10 @@ + if (op != Py_EQ && op != Py_NE) + Py_RETURN_NOTIMPLEMENTED; + if (Py_TYPE(other) != &PyDateTime_TimeZoneType) { +- if (op == Py_EQ) +- Py_RETURN_FALSE; +- else +- Py_RETURN_TRUE; ++ if (op == Py_EQ) ++ Py_RETURN_FALSE; ++ else ++ Py_RETURN_TRUE; + } + return delta_richcompare(self->offset, other->offset, op); + } +@@ -4814,7 +4814,7 @@ + static char *keywords[] = {"tz", NULL}; + + if (! 
PyArg_ParseTupleAndKeywords(args, kw, "|O:astimezone", keywords, +- &tzinfo)) ++ &tzinfo)) + return NULL; + + if (check_tzinfo_subclass(tzinfo) == -1) +diff -r c0e311e010fc Modules/_io/_iomodule.c +--- a/Modules/_io/_iomodule.c ++++ b/Modules/_io/_iomodule.c +@@ -235,11 +235,12 @@ + char rawmode[6], *m; + int line_buffering, isatty; + +- PyObject *raw, *modeobj = NULL, *buffer = NULL, *wrapper = NULL; ++ PyObject *raw, *modeobj = NULL, *buffer, *wrapper, *result = NULL; + + _Py_IDENTIFIER(isatty); + _Py_IDENTIFIER(fileno); + _Py_IDENTIFIER(mode); ++ _Py_IDENTIFIER(close); + + if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|sizzziO:open", kwlist, + &file, &mode, &buffering, +@@ -354,6 +355,7 @@ + "OsiO", file, rawmode, closefd, opener); + if (raw == NULL) + return NULL; ++ result = raw; + + modeobj = PyUnicode_FromString(mode); + if (modeobj == NULL) +@@ -412,7 +414,7 @@ + } + + Py_DECREF(modeobj); +- return raw; ++ return result; + } + + /* wraps into a buffered file */ +@@ -433,15 +435,16 @@ + + buffer = PyObject_CallFunction(Buffered_class, "Oi", raw, buffering); + } +- Py_CLEAR(raw); + if (buffer == NULL) + goto error; ++ result = buffer; ++ Py_DECREF(raw); + + + /* if binary, returns the buffered file */ + if (binary) { + Py_DECREF(modeobj); +- return buffer; ++ return result; + } + + /* wraps into a TextIOWrapper */ +@@ -450,20 +453,37 @@ + buffer, + encoding, errors, newline, + line_buffering); +- Py_CLEAR(buffer); + if (wrapper == NULL) + goto error; ++ result = wrapper; ++ Py_DECREF(buffer); + + if (_PyObject_SetAttrId(wrapper, &PyId_mode, modeobj) < 0) + goto error; + Py_DECREF(modeobj); +- return wrapper; ++ return result; + + error: +- Py_XDECREF(raw); ++ if (result != NULL) { ++ PyObject *exc, *val, *tb, *close_result; ++ PyErr_Fetch(&exc, &val, &tb); ++ close_result = _PyObject_CallMethodId(result, &PyId_close, NULL); ++ if (close_result != NULL) { ++ Py_DECREF(close_result); ++ PyErr_Restore(exc, val, tb); ++ } else { ++ PyObject *exc2, *val2, *tb2; ++ PyErr_Fetch(&exc2, &val2, &tb2); ++ PyErr_NormalizeException(&exc, &val, &tb); ++ Py_XDECREF(exc); ++ Py_XDECREF(tb); ++ PyErr_NormalizeException(&exc2, &val2, &tb2); ++ PyException_SetContext(val2, val); ++ PyErr_Restore(exc2, val2, tb2); ++ } ++ Py_DECREF(result); ++ } + Py_XDECREF(modeobj); +- Py_XDECREF(buffer); +- Py_XDECREF(wrapper); + return NULL; + } + +diff -r c0e311e010fc Modules/_io/bufferedio.c +--- a/Modules/_io/bufferedio.c ++++ b/Modules/_io/bufferedio.c +@@ -548,13 +548,14 @@ + PyErr_Restore(exc, val, tb); + } + else { +- PyObject *val2; ++ PyObject *exc2, *val2, *tb2; ++ PyErr_Fetch(&exc2, &val2, &tb2); ++ PyErr_NormalizeException(&exc, &val, &tb); + Py_DECREF(exc); + Py_XDECREF(tb); +- PyErr_Fetch(&exc, &val2, &tb); +- PyErr_NormalizeException(&exc, &val2, &tb); ++ PyErr_NormalizeException(&exc2, &val2, &tb2); + PyException_SetContext(val2, val); +- PyErr_Restore(exc, val2, tb); ++ PyErr_Restore(exc2, val2, tb2); + } + } + +diff -r c0e311e010fc Modules/_io/fileio.c +--- a/Modules/_io/fileio.c ++++ b/Modules/_io/fileio.c +@@ -691,9 +691,9 @@ + } + continue; + } +- if (bytes_read > 0) +- break; + if (errno == EAGAIN) { ++ if (bytes_read > 0) ++ break; + Py_DECREF(result); + Py_RETURN_NONE; + } +diff -r c0e311e010fc Modules/_io/textio.c +--- a/Modules/_io/textio.c ++++ b/Modules/_io/textio.c +@@ -2613,13 +2613,14 @@ + PyErr_Restore(exc, val, tb); + } + else { +- PyObject *val2; ++ PyObject *exc2, *val2, *tb2; ++ PyErr_Fetch(&exc2, &val2, &tb2); ++ PyErr_NormalizeException(&exc, &val, &tb); + Py_DECREF(exc); + 
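The error path added to _io.open() and the close() fixes in bufferedio.c/textio.c ensure that when cleanup itself fails, the first exception survives as the __context__ of the second instead of being silently replaced. The same chaining is visible from pure Python; a tiny illustration, not part of the patch:

    try:
        try:
            raise ValueError('original failure')
        except ValueError:
            raise OSError('close() also failed')
    except OSError as e:
        # the original ValueError is kept as the context of the OSError
        print(repr(e.__context__))
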
Py_XDECREF(tb); +- PyErr_Fetch(&exc, &val2, &tb); +- PyErr_NormalizeException(&exc, &val2, &tb); ++ PyErr_NormalizeException(&exc2, &val2, &tb2); + PyException_SetContext(val2, val); +- PyErr_Restore(exc, val2, tb); ++ PyErr_Restore(exc2, val2, tb2); + } + } + return res; +diff -r c0e311e010fc Modules/_posixsubprocess.c +--- a/Modules/_posixsubprocess.c ++++ b/Modules/_posixsubprocess.c +@@ -44,10 +44,6 @@ + #define POSIX_CALL(call) do { if ((call) == -1) goto error; } while (0) + + +-/* Maximum file descriptor, initialized on module load. */ +-static long max_fd; +- +- + /* Given the gc module call gc.enable() and return 0 on success. */ + static int + _enable_gc(PyObject *gc_module) +@@ -166,14 +162,39 @@ + } + + +-/* Close all file descriptors in the range start_fd inclusive to +- * end_fd exclusive except for those in py_fds_to_keep. If the +- * range defined by [start_fd, end_fd) is large this will take a +- * long time as it calls close() on EVERY possible fd. ++/* Get the maximum file descriptor that could be opened by this process. ++ * This function is async signal safe for use between fork() and exec(). ++ */ ++static long ++safe_get_max_fd(void) ++{ ++ long local_max_fd; ++#if defined(__NetBSD__) ++ local_max_fd = fcntl(0, F_MAXFD); ++ if (local_max_fd >= 0) ++ return local_max_fd; ++#endif ++#ifdef _SC_OPEN_MAX ++ local_max_fd = sysconf(_SC_OPEN_MAX); ++ if (local_max_fd == -1) ++#endif ++ local_max_fd = 256; /* Matches legacy Lib/subprocess.py behavior. */ ++ return local_max_fd; ++} ++ ++ ++/* Close all file descriptors in the range from start_fd and higher ++ * except for those in py_fds_to_keep. If the range defined by ++ * [start_fd, safe_get_max_fd()) is large this will take a long ++ * time as it calls close() on EVERY possible fd. ++ * ++ * It isn't possible to know for sure what the max fd to go up to ++ * is for processes with the capability of raising their maximum. + */ + static void +-_close_fds_by_brute_force(int start_fd, int end_fd, PyObject *py_fds_to_keep) ++_close_fds_by_brute_force(long start_fd, PyObject *py_fds_to_keep) + { ++ long end_fd = safe_get_max_fd(); + Py_ssize_t num_fds_to_keep = PySequence_Length(py_fds_to_keep); + Py_ssize_t keep_seq_idx; + int fd_num; +@@ -213,8 +234,8 @@ + char d_name[256]; /* Filename (null-terminated) */ + }; + +-/* Close all open file descriptors in the range start_fd inclusive to end_fd +- * exclusive. Do not close any in the sorted py_fds_to_keep list. ++/* Close all open file descriptors in the range from start_fd and higher ++ * Do not close any in the sorted py_fds_to_keep list. + * + * This version is async signal safe as it does not make any unsafe C library + * calls, malloc calls or handle any locks. It is _unfortunate_ to be forced +@@ -229,16 +250,14 @@ + * it with some cpp #define magic to work on other OSes as well if you want. + */ + static void +-_close_open_fd_range_safe(int start_fd, int end_fd, PyObject* py_fds_to_keep) ++_close_open_fds_safe(int start_fd, PyObject* py_fds_to_keep) + { + int fd_dir_fd; +- if (start_fd >= end_fd) +- return; + + fd_dir_fd = _Py_open(FD_DIR, O_RDONLY); + if (fd_dir_fd == -1) { + /* No way to get a list of open fds. */ +- _close_fds_by_brute_force(start_fd, end_fd, py_fds_to_keep); ++ _close_fds_by_brute_force(start_fd, py_fds_to_keep); + return; + } else { + char buffer[sizeof(struct linux_dirent64)]; +@@ -253,23 +272,23 @@ + entry = (struct linux_dirent64 *)(buffer + offset); + if ((fd = _pos_int_from_ascii(entry->d_name)) < 0) + continue; /* Not a number. 
*/ +- if (fd != fd_dir_fd && fd >= start_fd && fd < end_fd && ++ if (fd != fd_dir_fd && fd >= start_fd && + !_is_fd_in_sorted_fd_sequence(fd, py_fds_to_keep)) { + while (close(fd) < 0 && errno == EINTR); + } + } + } +- close(fd_dir_fd); ++ while (close(fd_dir_fd) < 0 && errno == EINTR); + } + } + +-#define _close_open_fd_range _close_open_fd_range_safe ++#define _close_open_fds _close_open_fds_safe + + #else /* NOT (defined(__linux__) && defined(HAVE_SYS_SYSCALL_H)) */ + + +-/* Close all open file descriptors in the range start_fd inclusive to end_fd +- * exclusive. Do not close any in the sorted py_fds_to_keep list. ++/* Close all open file descriptors from start_fd and higher. ++ * Do not close any in the sorted py_fds_to_keep list. + * + * This function violates the strict use of async signal safe functions. :( + * It calls opendir(), readdir() and closedir(). Of these, the one most +@@ -282,17 +301,13 @@ + * http://womble.decadent.org.uk/readdir_r-advisory.html + */ + static void +-_close_open_fd_range_maybe_unsafe(int start_fd, int end_fd, +- PyObject* py_fds_to_keep) ++_close_open_fds_maybe_unsafe(long start_fd, PyObject* py_fds_to_keep) + { + DIR *proc_fd_dir; + #ifndef HAVE_DIRFD +- while (_is_fd_in_sorted_fd_sequence(start_fd, py_fds_to_keep) && +- (start_fd < end_fd)) { ++ while (_is_fd_in_sorted_fd_sequence(start_fd, py_fds_to_keep)) { + ++start_fd; + } +- if (start_fd >= end_fd) +- return; + /* Close our lowest fd before we call opendir so that it is likely to + * reuse that fd otherwise we might close opendir's file descriptor in + * our loop. This trick assumes that fd's are allocated on a lowest +@@ -300,8 +315,6 @@ + while (close(start_fd) < 0 && errno == EINTR); + ++start_fd; + #endif +- if (start_fd >= end_fd) +- return; + + #if defined(__FreeBSD__) + if (!_is_fdescfs_mounted_on_dev_fd()) +@@ -311,7 +324,7 @@ + proc_fd_dir = opendir(FD_DIR); + if (!proc_fd_dir) { + /* No way to get a list of open fds. */ +- _close_fds_by_brute_force(start_fd, end_fd, py_fds_to_keep); ++ _close_fds_by_brute_force(start_fd, py_fds_to_keep); + } else { + struct dirent *dir_entry; + #ifdef HAVE_DIRFD +@@ -324,7 +337,7 @@ + int fd; + if ((fd = _pos_int_from_ascii(dir_entry->d_name)) < 0) + continue; /* Not a number. */ +- if (fd != fd_used_by_opendir && fd >= start_fd && fd < end_fd && ++ if (fd != fd_used_by_opendir && fd >= start_fd && + !_is_fd_in_sorted_fd_sequence(fd, py_fds_to_keep)) { + while (close(fd) < 0 && errno == EINTR); + } +@@ -332,13 +345,13 @@ + } + if (errno) { + /* readdir error, revert behavior. Highly Unlikely. */ +- _close_fds_by_brute_force(start_fd, end_fd, py_fds_to_keep); ++ _close_fds_by_brute_force(start_fd, py_fds_to_keep); + } + closedir(proc_fd_dir); + } + } + +-#define _close_open_fd_range _close_open_fd_range_maybe_unsafe ++#define _close_open_fds _close_open_fds_maybe_unsafe + + #endif /* else NOT (defined(__linux__) && defined(HAVE_SYS_SYSCALL_H)) */ + +@@ -457,14 +470,8 @@ + + /* close FDs after executing preexec_fn, which might open FDs */ + if (close_fds) { +- int local_max_fd = max_fd; +-#if defined(__NetBSD__) +- local_max_fd = fcntl(0, F_MAXFD); +- if (local_max_fd < 0) +- local_max_fd = max_fd; +-#endif + /* TODO HP-UX could use pstat_getproc() if anyone cares about it. 
*/ +- _close_open_fd_range(3, local_max_fd, py_fds_to_keep); ++ _close_open_fds(3, py_fds_to_keep); + } + + /* This loop matches the Lib/os.py _execvpe()'s PATH search when */ +@@ -759,11 +766,5 @@ + PyMODINIT_FUNC + PyInit__posixsubprocess(void) + { +-#ifdef _SC_OPEN_MAX +- max_fd = sysconf(_SC_OPEN_MAX); +- if (max_fd == -1) +-#endif +- max_fd = 256; /* Matches Lib/subprocess.py */ +- + return PyModule_Create(&_posixsubprocessmodule); + } +diff -r c0e311e010fc Modules/_sqlite/cursor.c +--- a/Modules/_sqlite/cursor.c ++++ b/Modules/_sqlite/cursor.c +@@ -289,9 +289,8 @@ + Py_END_ALLOW_THREADS + + row = PyTuple_New(numcols); +- if (!row) { ++ if (!row) + return NULL; +- } + + for (i = 0; i < numcols; i++) { + if (self->connection->detect_types) { +@@ -311,14 +310,12 @@ + converted = Py_None; + } else { + item = PyBytes_FromStringAndSize(val_str, nbytes); +- if (!item) { +- return NULL; +- } ++ if (!item) ++ goto error; + converted = PyObject_CallFunction(converter, "O", item); + Py_DECREF(item); +- if (!converted) { ++ if (!converted) + break; +- } + } + } else { + Py_BEGIN_ALLOW_THREADS +@@ -374,9 +371,8 @@ + nbytes = sqlite3_column_bytes(self->statement->st, i); + buffer = PyBytes_FromStringAndSize( + sqlite3_column_blob(self->statement->st, i), nbytes); +- if (!buffer) { ++ if (!buffer) + break; +- } + converted = buffer; + } + } +@@ -389,12 +385,14 @@ + } + } + +- if (PyErr_Occurred()) { +- Py_DECREF(row); +- row = NULL; +- } ++ if (PyErr_Occurred()) ++ goto error; + + return row; ++ ++error: ++ Py_DECREF(row); ++ return NULL; + } + + /* +@@ -612,6 +610,10 @@ + while (1) { + /* Actually execute the SQL statement. */ + rc = pysqlite_step(self->statement->st, self->connection); ++ if (PyErr_Occurred()) { ++ (void)pysqlite_statement_reset(self->statement); ++ goto error; ++ } + if (rc == SQLITE_DONE || rc == SQLITE_ROW) { + /* If it worked, let's get out of the loop */ + break; +@@ -685,6 +687,8 @@ + } + + self->next_row = _pysqlite_fetch_one_row(self); ++ if (self->next_row == NULL) ++ goto error; + } else if (rc == SQLITE_DONE && !multiple) { + pysqlite_statement_reset(self->statement); + Py_CLEAR(self->statement); +@@ -807,7 +811,10 @@ + rc = SQLITE_ROW; + while (rc == SQLITE_ROW) { + rc = pysqlite_step(statement, self->connection); +- /* TODO: we probably need more error handling here */ ++ if (PyErr_Occurred()) { ++ (void)sqlite3_finalize(statement); ++ goto error; ++ } + } + + if (rc != SQLITE_DONE) { +@@ -884,6 +891,11 @@ + + if (self->statement) { + rc = pysqlite_step(self->statement->st, self->connection); ++ if (PyErr_Occurred()) { ++ (void)pysqlite_statement_reset(self->statement); ++ Py_DECREF(next_row); ++ return NULL; ++ } + if (rc != SQLITE_DONE && rc != SQLITE_ROW) { + (void)pysqlite_statement_reset(self->statement); + Py_DECREF(next_row); +@@ -895,8 +907,6 @@ + self->next_row = _pysqlite_fetch_one_row(self); + if (self->next_row == NULL) { + (void)pysqlite_statement_reset(self->statement); +- Py_DECREF(next_row); +- _pysqlite_seterror(self->connection->db, NULL); + return NULL; + } + } +diff -r c0e311e010fc Modules/_sqlite/row.c +--- a/Modules/_sqlite/row.c ++++ b/Modules/_sqlite/row.c +@@ -63,9 +63,16 @@ + return 0; + } + ++PyObject* pysqlite_row_item(pysqlite_Row* self, Py_ssize_t idx) ++{ ++ PyObject* item = PyTuple_GetItem(self->data, idx); ++ Py_XINCREF(item); ++ return item; ++} ++ + PyObject* pysqlite_row_subscript(pysqlite_Row* self, PyObject* idx) + { +- long _idx; ++ Py_ssize_t _idx; + char* key; + Py_ssize_t nitems, i; + char* compare_key; +@@ -76,7 +83,11 @@ + 
PyObject* item; + + if (PyLong_Check(idx)) { +- _idx = PyLong_AsLong(idx); ++ _idx = PyNumber_AsSsize_t(idx, PyExc_IndexError); ++ if (_idx == -1 && PyErr_Occurred()) ++ return NULL; ++ if (_idx < 0) ++ _idx += PyTuple_GET_SIZE(self->data); + item = PyTuple_GetItem(self->data, _idx); + Py_XINCREF(item); + return item; +@@ -198,6 +209,14 @@ + /* mp_ass_subscript */ (objobjargproc)0, + }; + ++static PySequenceMethods pysqlite_row_as_sequence = { ++ /* sq_length */ (lenfunc)pysqlite_row_length, ++ /* sq_concat */ 0, ++ /* sq_repeat */ 0, ++ /* sq_item */ (ssizeargfunc)pysqlite_row_item, ++}; ++ ++ + static PyMethodDef pysqlite_row_methods[] = { + {"keys", (PyCFunction)pysqlite_row_keys, METH_NOARGS, + PyDoc_STR("Returns the keys of the row.")}, +@@ -251,5 +270,6 @@ + { + pysqlite_RowType.tp_new = PyType_GenericNew; + pysqlite_RowType.tp_as_mapping = &pysqlite_row_as_mapping; ++ pysqlite_RowType.tp_as_sequence = &pysqlite_row_as_sequence; + return PyType_Ready(&pysqlite_RowType); + } +diff -r c0e311e010fc Modules/_ssl.c +--- a/Modules/_ssl.c ++++ b/Modules/_ssl.c +@@ -14,6 +14,8 @@ + http://bugs.python.org/issue8108#msg102867 ? + */ + ++#define PY_SSIZE_T_CLEAN ++ + #include "Python.h" + + #ifdef WITH_THREAD +@@ -3235,12 +3237,17 @@ + PySSL_RAND_add(PyObject *self, PyObject *args) + { + char *buf; +- int len; ++ Py_ssize_t len, written; + double entropy; + + if (!PyArg_ParseTuple(args, "s#d:RAND_add", &buf, &len, &entropy)) + return NULL; +- RAND_add(buf, len, entropy); ++ do { ++ written = Py_MIN(len, INT_MAX); ++ RAND_add(buf, (int)written, entropy); ++ buf += written; ++ len -= written; ++ } while (len); + Py_INCREF(Py_None); + return Py_None; + } +@@ -3409,7 +3416,7 @@ + int nid; + const char *ln, *sn; + char buf[100]; +- int buflen; ++ Py_ssize_t buflen; + + nid = OBJ_obj2nid(obj); + if (nid == NID_undef) { +diff -r c0e311e010fc Modules/_testcapimodule.c +--- a/Modules/_testcapimodule.c ++++ b/Modules/_testcapimodule.c +@@ -3104,9 +3104,9 @@ + {"pytime_object_to_timeval", test_pytime_object_to_timeval, METH_VARARGS}, + {"pytime_object_to_timespec", test_pytime_object_to_timespec, METH_VARARGS}, + {"with_tp_del", with_tp_del, METH_VARARGS}, +- {"test_pymem", ++ {"test_pymem_alloc0", + (PyCFunction)test_pymem_alloc0, METH_NOARGS}, +- {"test_pymem_alloc0", ++ {"test_pymem_setrawallocators", + (PyCFunction)test_pymem_setrawallocators, METH_NOARGS}, + {"test_pymem_setallocators", + (PyCFunction)test_pymem_setallocators, METH_NOARGS}, +diff -r c0e311e010fc Modules/_tkinter.c +--- a/Modules/_tkinter.c ++++ b/Modules/_tkinter.c +@@ -339,8 +339,10 @@ + const char *e = s + size; + PyErr_Clear(); + q = buf = (char *)PyMem_Malloc(size); +- if (buf == NULL) ++ if (buf == NULL) { ++ PyErr_NoMemory(); + return NULL; ++ } + while (s != e) { + if (s + 1 != e && s[0] == '\xc0' && s[1] == '\x80') { + *q++ = '\0'; +@@ -861,6 +863,16 @@ + }; + + ++#if PY_SIZE_MAX > INT_MAX ++#define CHECK_STRING_LENGTH(s) do { \ ++ if (s != NULL && strlen(s) >= INT_MAX) { \ ++ PyErr_SetString(PyExc_OverflowError, "string is too long"); \ ++ return NULL; \ ++ } } while(0) ++#else ++#define CHECK_STRING_LENGTH(s) ++#endif ++ + static Tcl_Obj* + AsObj(PyObject *value) + { +@@ -1279,6 +1291,7 @@ + if (!PyArg_ParseTuple(args, "s:eval", &script)) + return NULL; + ++ CHECK_STRING_LENGTH(script); + CHECK_TCL_APPARTMENT; + + ENTER_TCL +@@ -1302,6 +1315,7 @@ + if (!PyArg_ParseTuple(args, "s:evalfile", &fileName)) + return NULL; + ++ CHECK_STRING_LENGTH(fileName); + CHECK_TCL_APPARTMENT; + + ENTER_TCL +@@ -1322,9 +1336,10 @@ + PyObject 
*res = NULL; + int err; + +- if (!PyArg_ParseTuple(args, "s", &script)) ++ if (!PyArg_ParseTuple(args, "s:record", &script)) + return NULL; + ++ CHECK_STRING_LENGTH(script); + CHECK_TCL_APPARTMENT; + + ENTER_TCL +@@ -1345,6 +1360,7 @@ + + if (!PyArg_ParseTuple(args, "s:adderrorinfo", &msg)) + return NULL; ++ CHECK_STRING_LENGTH(msg); + CHECK_TCL_APPARTMENT; + + ENTER_TCL +@@ -1528,6 +1544,8 @@ + if (!PyArg_ParseTuple(args, "ssO:setvar", + &name1, &name2, &newValue)) + return NULL; ++ CHECK_STRING_LENGTH(name1); ++ CHECK_STRING_LENGTH(name2); + /* XXX must hold tcl lock already??? */ + newval = AsObj(newValue); + ENTER_TCL +@@ -1573,6 +1591,7 @@ + varname_converter, &name1, &name2)) + return NULL; + ++ CHECK_STRING_LENGTH(name2); + ENTER_TCL + tres = Tcl_GetVar2Ex(Tkapp_Interp(self), name1, name2, flags); + ENTER_OVERLAP +@@ -1615,6 +1634,8 @@ + if (!PyArg_ParseTuple(args, "s|s:unsetvar", &name1, &name2)) + return NULL; + ++ CHECK_STRING_LENGTH(name1); ++ CHECK_STRING_LENGTH(name2); + ENTER_TCL + code = Tcl_UnsetVar2(Tkapp_Interp(self), name1, name2, flags); + ENTER_OVERLAP +@@ -1660,6 +1681,7 @@ + } + if (!PyArg_ParseTuple(args, "s:getint", &s)) + return NULL; ++ CHECK_STRING_LENGTH(s); + if (Tcl_GetInt(Tkapp_Interp(self), s, &v) == TCL_ERROR) + return Tkinter_Error(self); + return Py_BuildValue("i", v); +@@ -1680,6 +1702,7 @@ + } + if (!PyArg_ParseTuple(args, "s:getdouble", &s)) + return NULL; ++ CHECK_STRING_LENGTH(s); + if (Tcl_GetDouble(Tkapp_Interp(self), s, &v) == TCL_ERROR) + return Tkinter_Error(self); + return Py_BuildValue("d", v); +@@ -1700,6 +1723,7 @@ + } + if (!PyArg_ParseTuple(args, "s:getboolean", &s)) + return NULL; ++ CHECK_STRING_LENGTH(s); + if (Tcl_GetBoolean(Tkapp_Interp(self), s, &v) == TCL_ERROR) + return Tkinter_Error(self); + return PyBool_FromLong(v); +@@ -1715,6 +1739,7 @@ + if (!PyArg_ParseTuple(args, "s:exprstring", &s)) + return NULL; + ++ CHECK_STRING_LENGTH(s); + CHECK_TCL_APPARTMENT; + + ENTER_TCL +@@ -1739,6 +1764,7 @@ + if (!PyArg_ParseTuple(args, "s:exprlong", &s)) + return NULL; + ++ CHECK_STRING_LENGTH(s); + CHECK_TCL_APPARTMENT; + + ENTER_TCL +@@ -1762,6 +1788,7 @@ + + if (!PyArg_ParseTuple(args, "s:exprdouble", &s)) + return NULL; ++ CHECK_STRING_LENGTH(s); + CHECK_TCL_APPARTMENT; + PyFPE_START_PROTECT("Tkapp_ExprDouble", return 0) + ENTER_TCL +@@ -1786,6 +1813,7 @@ + + if (!PyArg_ParseTuple(args, "s:exprboolean", &s)) + return NULL; ++ CHECK_STRING_LENGTH(s); + CHECK_TCL_APPARTMENT; + ENTER_TCL + retval = Tcl_ExprBoolean(Tkapp_Interp(self), s, &v); +@@ -1838,6 +1866,7 @@ + if (!PyArg_ParseTuple(args, "et:splitlist", "utf-8", &list)) + return NULL; + ++ CHECK_STRING_LENGTH(list); + if (Tcl_SplitList(Tkapp_Interp(self), list, + &argc, &argv) == TCL_ERROR) { + PyMem_Free(list); +@@ -1899,6 +1928,7 @@ + + if (!PyArg_ParseTuple(args, "et:split", "utf-8", &list)) + return NULL; ++ CHECK_STRING_LENGTH(list); + v = Split(list); + PyMem_Free(list); + return v; +@@ -2030,6 +2060,7 @@ + + if (!PyArg_ParseTuple(args, "sO:createcommand", &cmdName, &func)) + return NULL; ++ CHECK_STRING_LENGTH(cmdName); + if (!PyCallable_Check(func)) { + PyErr_SetString(PyExc_TypeError, "command not callable"); + return NULL; +@@ -2091,6 +2122,7 @@ + + if (!PyArg_ParseTuple(args, "s:deletecommand", &cmdName)) + return NULL; ++ CHECK_STRING_LENGTH(cmdName); + + #ifdef WITH_THREAD + if (self->threaded && self->thread_id != Tcl_GetCurrentThread()) { +@@ -2782,6 +2814,10 @@ + &interactive, &wantobjects, &wantTk, + &sync, &use)) + return NULL; ++ CHECK_STRING_LENGTH(screenName); ++ 
CHECK_STRING_LENGTH(baseName); ++ CHECK_STRING_LENGTH(className); ++ CHECK_STRING_LENGTH(use); + + return (PyObject *) Tkapp_New(screenName, className, + interactive, wantobjects, wantTk, +diff -r c0e311e010fc Modules/getpath.c +--- a/Modules/getpath.c ++++ b/Modules/getpath.c +@@ -734,6 +734,11 @@ + + bufsz += wcslen(zip_path) + 1; + bufsz += wcslen(exec_prefix) + 1; ++ /* When running from the build directory, add room for the Modules ++ * subdirectory too. ++ */ ++ if (efound == -1) ++ bufsz += wcslen(argv0_path) + wcslen(L"Modules") + 2; + + buf = (wchar_t *)PyMem_Malloc(bufsz * sizeof(wchar_t)); + if (buf == NULL) { +@@ -781,6 +786,15 @@ + + /* Finally, on goes the directory for dynamic-load modules */ + wcscat(buf, exec_prefix); ++ /* And, if we run from a build directory, the Modules directory (for ++ * modules built with Modules/Setup.) ++ */ ++ if (efound == -1) { ++ wcscat(buf, delimiter); ++ wcscat(buf, argv0_path); ++ wcscat(buf, separator); ++ wcscat(buf, L"Modules"); ++ } + + /* And publish the results */ + module_search_path = buf; +diff -r c0e311e010fc Modules/hashtable.c +--- a/Modules/hashtable.c ++++ b/Modules/hashtable.c +@@ -233,11 +233,12 @@ + nchains++; + } + } +- printf("hash table %p: entries=%zu/%zu (%.0f%%), ", ++ printf("hash table %p: entries=%" ++ PY_FORMAT_SIZE_T "u/%" PY_FORMAT_SIZE_T "u (%.0f%%), ", + ht, ht->entries, ht->num_buckets, load * 100.0); + if (nchains) + printf("avg_chain_len=%.1f, ", (double)total_chain_len / nchains); +- printf("max_chain_len=%zu, %zu kB\n", ++ printf("max_chain_len=%" PY_FORMAT_SIZE_T "u, %" PY_FORMAT_SIZE_T "u kB\n", + max_chain_len, size / 1024); + } + #endif +diff -r c0e311e010fc Modules/itertoolsmodule.c +--- a/Modules/itertoolsmodule.c ++++ b/Modules/itertoolsmodule.c +@@ -4109,14 +4109,17 @@ + { + repeatobject *ro; + PyObject *element; +- Py_ssize_t cnt = -1; ++ Py_ssize_t cnt = -1, n_kwds = 0; + static char *kwargs[] = {"object", "times", NULL}; + + if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|n:repeat", kwargs, + &element, &cnt)) + return NULL; + +- if (PyTuple_Size(args) == 2 && cnt < 0) ++ if (kwds != NULL) ++ n_kwds = PyDict_Size(kwds); ++ /* Does user supply times argument? */ ++ if ((PyTuple_Size(args) + n_kwds == 2) && cnt < 0) + cnt = 0; + + ro = (repeatobject *)type->tp_alloc(type, 0); +diff -r c0e311e010fc Modules/makesetup +--- a/Modules/makesetup ++++ b/Modules/makesetup +@@ -217,7 +217,7 @@ + *) src='$(srcdir)/'"$srcdir/$src";; + esac + case $doconfig in +- no) cc="$cc \$(CCSHARED) \$(CFLAGS) \$(CPPFLAGS)";; ++ no) cc="$cc \$(CCSHARED) \$(PY_CFLAGS) \$(PY_CPPFLAGS)";; + *) + cc="$cc \$(PY_CORE_CFLAGS)";; + esac +@@ -229,11 +229,7 @@ + esac + for mod in $mods + do +- case $objs in +- *$mod.o*) base=$mod;; +- *) base=${mod}module;; +- esac +- file="$srcdir/$base\$(SO)" ++ file="$srcdir/$mod\$(EXT_SUFFIX)" + case $doconfig in + no) SHAREDMODS="$SHAREDMODS $file";; + esac +diff -r c0e311e010fc Modules/readline.c +--- a/Modules/readline.c ++++ b/Modules/readline.c +@@ -1019,6 +1019,21 @@ + + mod_state->begidx = PyLong_FromLong(0L); + mod_state->endidx = PyLong_FromLong(0L); ++ ++#ifndef __APPLE__ ++ if (!isatty(STDOUT_FILENO)) { ++ /* Issue #19884: stdout is no a terminal. Disable meta modifier ++ keys to not write the ANSI sequence "\033[1034h" into stdout. On ++ terminals supporting 8 bit characters like TERM=xterm-256color ++ (which is now the default Fedora since Fedora 18), the meta key is ++ used to enable support of 8 bit characters (ANSI sequence ++ "\033[1034h"). 
++ ++ With libedit, this call makes readline() crash. */ ++ rl_variable_bind ("enable-meta-key", "off"); ++ } ++#endif ++ + /* Initialize (allows .inputrc to override) + * + * XXX: A bug in the readline-2.2 library causes a memory leak +diff -r c0e311e010fc Modules/socketmodule.c +--- a/Modules/socketmodule.c ++++ b/Modules/socketmodule.c +@@ -33,8 +33,8 @@ + - socket.ntohl(32 bit value) --> new int object + - socket.htons(16 bit value) --> new int object + - socket.htonl(32 bit value) --> new int object +-- socket.getaddrinfo(host, port [, family, socktype, proto, flags]) +- --> List of (family, socktype, proto, canonname, sockaddr) ++- socket.getaddrinfo(host, port [, family, type, proto, flags]) ++ --> List of (family, type, proto, canonname, sockaddr) + - socket.getnameinfo(sockaddr, flags) --> (host, port) + - socket.AF_INET, socket.SOCK_STREAM, etc.: constants from + - socket.has_ipv6: boolean value indicating if IPv6 is supported +@@ -5292,8 +5292,8 @@ + } + + PyDoc_STRVAR(getaddrinfo_doc, +-"getaddrinfo(host, port [, family, socktype, proto, flags])\n\ +- -> list of (family, socktype, proto, canonname, sockaddr)\n\ ++"getaddrinfo(host, port [, family, type, proto, flags])\n\ ++ -> list of (family, type, proto, canonname, sockaddr)\n\ + \n\ + Resolve host and port into addrinfo struct."); + +diff -r c0e311e010fc Modules/unicodedata.c +--- a/Modules/unicodedata.c ++++ b/Modules/unicodedata.c +@@ -13,6 +13,8 @@ + + ------------------------------------------------------------------------ */ + ++#define PY_SSIZE_T_CLEAN ++ + #include "Python.h" + #include "ucnhash.h" + #include "structmember.h" +@@ -1271,12 +1273,16 @@ + Py_UCS4 code; + + char* name; +- int namelen; ++ Py_ssize_t namelen; + unsigned int index; + if (!PyArg_ParseTuple(args, "s#:lookup", &name, &namelen)) + return NULL; ++ if (namelen > INT_MAX) { ++ PyErr_SetString(PyExc_KeyError, "name too long"); ++ return NULL; ++ } + +- if (!_getcode(self, name, namelen, &code, 1)) { ++ if (!_getcode(self, name, (int)namelen, &code, 1)) { + PyErr_Format(PyExc_KeyError, "undefined character name '%s'", name); + return NULL; + } +diff -r c0e311e010fc Modules/zlibmodule.c +--- a/Modules/zlibmodule.c ++++ b/Modules/zlibmodule.c +@@ -3,6 +3,7 @@ + + /* Windows users: read Python's PCbuild\readme.txt */ + ++#define PY_SSIZE_T_CLEAN + + #include "Python.h" + #include "structmember.h" +diff -r c0e311e010fc Objects/abstract.c +--- a/Objects/abstract.c ++++ b/Objects/abstract.c +@@ -2191,9 +2191,8 @@ + return null_error(); + + func = PyObject_GetAttrString(o, name); +- if (func == NULL) { +- return 0; +- } ++ if (func == NULL) ++ return NULL; + + va_start(va, format); + retval = callmethod(func, format, va, 0); +@@ -2213,9 +2212,8 @@ + return null_error(); + + func = _PyObject_GetAttrId(o, name); +- if (func == NULL) { +- return 0; +- } ++ if (func == NULL) ++ return NULL; + + va_start(va, format); + retval = callmethod(func, format, va, 0); +@@ -2235,9 +2233,8 @@ + return null_error(); + + func = PyObject_GetAttrString(o, name); +- if (func == NULL) { +- return 0; +- } ++ if (func == NULL) ++ return NULL; + va_start(va, format); + retval = callmethod(func, format, va, 1); + va_end(va); +diff -r c0e311e010fc Objects/exceptions.c +--- a/Objects/exceptions.c ++++ b/Objects/exceptions.c +@@ -1254,6 +1254,9 @@ + * SyntaxError extends Exception + */ + ++/* Helper function to customise error message for some syntax errors */ ++static int _report_missing_parentheses(PySyntaxErrorObject *self); ++ + static int + SyntaxError_init(PySyntaxErrorObject 
*self, PyObject *args, PyObject *kwds) + { +@@ -1298,6 +1301,13 @@ + Py_INCREF(self->text); + + Py_DECREF(info); ++ ++ /* Issue #21669: Custom error for 'print' & 'exec' as statements */ ++ if (self->text && PyUnicode_Check(self->text)) { ++ if (_report_missing_parentheses(self) < 0) { ++ return -1; ++ } ++ } + } + return 0; + } +@@ -2783,3 +2793,128 @@ + PyErr_Restore(new_exc, new_val, new_tb); + return new_val; + } ++ ++ ++/* To help with migration from Python 2, SyntaxError.__init__ applies some ++ * heuristics to try to report a more meaningful exception when print and ++ * exec are used like statements. ++ * ++ * The heuristics are currently expected to detect the following cases: ++ * - top level statement ++ * - statement in a nested suite ++ * - trailing section of a one line complex statement ++ * ++ * They're currently known not to trigger: ++ * - after a semi-colon ++ * ++ * The error message can be a bit odd in cases where the "arguments" are ++ * completely illegal syntactically, but that isn't worth the hassle of ++ * fixing. ++ * ++ * We also can't do anything about cases that are legal Python 3 syntax ++ * but mean something entirely different from what they did in Python 2 ++ * (omitting the arguments entirely, printing items preceded by a unary plus ++ * or minus, using the stream redirection syntax). ++ */ ++ ++static int ++_check_for_legacy_statements(PySyntaxErrorObject *self, Py_ssize_t start) ++{ ++ /* Return values: ++ * -1: an error occurred ++ * 0: nothing happened ++ * 1: the check triggered & the error message was changed ++ */ ++ static PyObject *print_prefix = NULL; ++ static PyObject *exec_prefix = NULL; ++ Py_ssize_t text_len = PyUnicode_GET_LENGTH(self->text); ++ int kind = PyUnicode_KIND(self->text); ++ void *data = PyUnicode_DATA(self->text); ++ ++ /* Ignore leading whitespace */ ++ while (start < text_len) { ++ Py_UCS4 ch = PyUnicode_READ(kind, data, start); ++ if (!Py_UNICODE_ISSPACE(ch)) ++ break; ++ start++; ++ } ++ /* Checking against an empty or whitespace-only part of the string */ ++ if (start == text_len) { ++ return 0; ++ } ++ ++ /* Check for legacy print statements */ ++ if (print_prefix == NULL) { ++ print_prefix = PyUnicode_InternFromString("print "); ++ if (print_prefix == NULL) { ++ return -1; ++ } ++ } ++ if (PyUnicode_Tailmatch(self->text, print_prefix, ++ start, text_len, -1)) { ++ Py_CLEAR(self->msg); ++ self->msg = PyUnicode_FromString( ++ "Missing parentheses in call to 'print'"); ++ return 1; ++ } ++ ++ /* Check for legacy exec statements */ ++ if (exec_prefix == NULL) { ++ exec_prefix = PyUnicode_InternFromString("exec "); ++ if (exec_prefix == NULL) { ++ return -1; ++ } ++ } ++ if (PyUnicode_Tailmatch(self->text, exec_prefix, ++ start, text_len, -1)) { ++ Py_CLEAR(self->msg); ++ self->msg = PyUnicode_FromString( ++ "Missing parentheses in call to 'exec'"); ++ return 1; ++ } ++ /* Fall back to the default error message */ ++ return 0; ++} ++ ++static int ++_report_missing_parentheses(PySyntaxErrorObject *self) ++{ ++ Py_UCS4 left_paren = 40; ++ Py_ssize_t left_paren_index; ++ Py_ssize_t text_len = PyUnicode_GET_LENGTH(self->text); ++ int legacy_check_result = 0; ++ ++ /* Skip entirely if there is an opening parenthesis */ ++ left_paren_index = PyUnicode_FindChar(self->text, left_paren, ++ 0, text_len, 1); ++ if (left_paren_index < -1) { ++ return -1; ++ } ++ if (left_paren_index != -1) { ++ /* Use default error message for any line with an opening paren */ ++ return 0; ++ } ++ /* Handle the simple statement case */ ++ 
legacy_check_result = _check_for_legacy_statements(self, 0); ++ if (legacy_check_result < 0) { ++ return -1; ++ ++ } ++ if (legacy_check_result == 0) { ++ /* Handle the one-line complex statement case */ ++ Py_UCS4 colon = 58; ++ Py_ssize_t colon_index; ++ colon_index = PyUnicode_FindChar(self->text, colon, ++ 0, text_len, 1); ++ if (colon_index < -1) { ++ return -1; ++ } ++ if (colon_index >= 0 && colon_index < text_len) { ++ /* Check again, starting from just after the colon */ ++ if (_check_for_legacy_statements(self, colon_index+1) < 0) { ++ return -1; ++ } ++ } ++ } ++ return 0; ++} +diff -r c0e311e010fc Objects/frameobject.c +--- a/Objects/frameobject.c ++++ b/Objects/frameobject.c +@@ -786,7 +786,7 @@ + PyObject *key = PyTuple_GET_ITEM(map, j); + PyObject *value = values[j]; + assert(PyUnicode_Check(key)); +- if (deref) { ++ if (deref && value != NULL) { + assert(PyCell_Check(value)); + value = PyCell_GET(value); + } +diff -r c0e311e010fc Objects/stringlib/README.txt +--- a/Objects/stringlib/README.txt ++++ b/Objects/stringlib/README.txt +@@ -1,4 +1,4 @@ +-bits shared by the stringobject and unicodeobject implementations (and ++bits shared by the bytesobject and unicodeobject implementations (and + possibly other modules, in a not too distant future). + + the stuff in here is included into relevant places; see the individual +diff -r c0e311e010fc Objects/unicodeobject.c +--- a/Objects/unicodeobject.c ++++ b/Objects/unicodeobject.c +@@ -1011,17 +1011,19 @@ + } + else + data = unicode->data.any; +- printf("%s: len=%zu, ",unicode_kind_name(op), ascii->length); ++ printf("%s: len=%" PY_FORMAT_SIZE_T "u, ", ++ unicode_kind_name(op), ascii->length); + + if (ascii->wstr == data) + printf("shared "); + printf("wstr=%p", ascii->wstr); + + if (!(ascii->state.ascii == 1 && ascii->state.compact == 1)) { +- printf(" (%zu), ", compact->wstr_length); ++ printf(" (%" PY_FORMAT_SIZE_T "u), ", compact->wstr_length); + if (!ascii->state.compact && compact->utf8 == unicode->data.any) + printf("shared "); +- printf("utf8=%p (%zu)", compact->utf8, compact->utf8_length); ++ printf("utf8=%p (%" PY_FORMAT_SIZE_T "u)", ++ compact->utf8, compact->utf8_length); + } + printf(", data=%p\n", data); + } +diff -r c0e311e010fc PC/pyconfig.h +--- a/PC/pyconfig.h ++++ b/PC/pyconfig.h +@@ -390,7 +390,7 @@ + #else + /* VC6, VS 2002 and eVC4 don't support the C99 LL suffix for 64-bit integer literals */ + #define Py_LL(x) x##I64 +-#endif /* _MSC_VER > 1200 */ ++#endif /* _MSC_VER > 1300 */ + #endif /* _MSC_VER */ + + #endif +@@ -436,6 +436,11 @@ + /* Define to 1 if you have the `copysign' function. */ + #define HAVE_COPYSIGN 1 + ++/* Define to 1 if you have the `round' function. */ ++#if _MSC_VER >= 1800 ++#define HAVE_ROUND 1 ++#endif ++ + /* Define to 1 if you have the `isinf' macro. */ + #define HAVE_DECL_ISINF 1 + +diff -r c0e311e010fc PC/winreg.c +--- a/PC/winreg.c ++++ b/PC/winreg.c +@@ -871,8 +871,10 @@ + /* ALSO handle ALL unknown data types here. Even if we can't + support it natively, we should handle the bits. 
*/ + default: +- if (value == Py_None) ++ if (value == Py_None) { + *retDataSize = 0; ++ *retDataBuf = NULL; ++ } + else { + Py_buffer view; + +diff -r c0e311e010fc PCbuild/build_ssl.py +--- a/PCbuild/build_ssl.py ++++ b/PCbuild/build_ssl.py +@@ -66,7 +66,7 @@ + # Fetch SSL directory from VC properties + def get_ssl_dir(): + propfile = (os.path.join(os.path.dirname(__file__), 'pyproject.props')) +- with open(propfile) as f: ++ with open(propfile, encoding='utf-8-sig') as f: + m = re.search('openssl-([^<]+)<', f.read()) + return "..\..\openssl-"+m.group(1) + +diff -r c0e311e010fc PCbuild/pyproject.props +--- a/PCbuild/pyproject.props ++++ b/PCbuild/pyproject.props +@@ -20,7 +20,7 @@ + $(externalsDir)\sqlite-3.8.3.1 + $(externalsDir)\bzip2-1.0.6 + $(externalsDir)\xz-5.0.5 +- $(externalsDir)\openssl-1.0.1g ++ $(externalsDir)\openssl-1.0.1h + $(externalsDir)\tcltk + $(externalsDir)\tcltk64 + $(tcltkDir)\lib\tcl86t.lib;$(tcltkDir)\lib\tk86t.lib +diff -r c0e311e010fc PCbuild/readme.txt +--- a/PCbuild/readme.txt ++++ b/PCbuild/readme.txt +@@ -9,16 +9,19 @@ + + Visual C++ 2010 Express Edition + Required for building 32-bit Debug and Release configuration builds. +- This edition does not support "solution folders", which pcbuild.sln +- uses; this will not prevent building. ++ The Python build solution pcbuild.sln makes use of Solution Folders, ++ which this edition does not support. Any time pcbuild.sln is opened ++ or reloaded by Visual C++, a warning about Solution Folders will be ++ displayed which can be safely dismissed with no impact on your ++ ability to build Python. + Visual Studio 2010 Professional Edition + Required for building 64-bit Debug and Release configuration builds + Visual Studio 2010 Premium Edition + Required for building Release configuration builds that make use of + Profile Guided Optimization (PGO), on either platform. + +-The official Python releases are built with PGO using Visual Studio 2010 +-Ultimate Edition. ++Installing Service Pack 1 for Visual Studio 2010 is highly recommended ++to avoid LNK1123 errors. + + All you need to do to build is open the solution "pcbuild.sln" in Visual + Studio, select the desired combination of configuration and platform, +@@ -46,7 +49,7 @@ + development of CPython, you will most likely use this configuration. + PGInstrument, PGUpdate + Used to build Python in Release configuration using PGO, which +- requires Professional Edition of Visual Studio. See the "Profile ++ requires Premium Edition of Visual Studio. See the "Profile + Guided Optimization" section below for more information. Build + output from each of these configurations lands in its own + sub-directory of this directory. 
The official Python releases are +@@ -168,7 +171,7 @@ + Homepage: + http://tukaani.org/xz/ + _ssl +- Python wrapper for version 1.0.1g of the OpenSSL secure sockets ++ Python wrapper for version 1.0.1h of the OpenSSL secure sockets + library, which is built by ssl.vcxproj + Homepage: + http://www.openssl.org/ +diff -r c0e311e010fc Parser/tokenizer.c +--- a/Parser/tokenizer.c ++++ b/Parser/tokenizer.c +@@ -1597,15 +1597,24 @@ + } while (isdigit(c)); + } + if (c == 'e' || c == 'E') { +- exponent: ++ int e; ++ exponent: ++ e = c; + /* Exponent part */ + c = tok_nextc(tok); +- if (c == '+' || c == '-') ++ if (c == '+' || c == '-') { + c = tok_nextc(tok); +- if (!isdigit(c)) { +- tok->done = E_TOKEN; ++ if (!isdigit(c)) { ++ tok->done = E_TOKEN; ++ tok_backup(tok, c); ++ return ERRORTOKEN; ++ } ++ } else if (!isdigit(c)) { + tok_backup(tok, c); +- return ERRORTOKEN; ++ tok_backup(tok, e); ++ *p_start = tok->start; ++ *p_end = tok->cur; ++ return NUMBER; + } + do { + c = tok_nextc(tok); +diff -r c0e311e010fc Python/bltinmodule.c +--- a/Python/bltinmodule.c ++++ b/Python/bltinmodule.c +@@ -1327,7 +1327,7 @@ + PyDoc_STRVAR(len_doc, + "len(object)\n\ + \n\ +-Return the number of items of a sequence or mapping."); ++Return the number of items of a sequence or collection."); + + + static PyObject * +@@ -1454,10 +1454,12 @@ + } + + PyDoc_STRVAR(min_doc, +-"min(iterable[, key=func]) -> value\n\ +-min(a, b, c, ...[, key=func]) -> value\n\ ++"min(iterable, *[, default=obj, key=func]) -> value\n\ ++min(arg1, arg2, *args, *[, key=func]) -> value\n\ + \n\ +-With a single iterable argument, return its smallest item.\n\ ++With a single iterable argument, return its smallest item. The\n\ ++default keyword-only argument specifies an object to return if\n\ ++the provided iterable is empty.\n\ + With two or more arguments, return the smallest argument."); + + +@@ -1468,10 +1470,12 @@ + } + + PyDoc_STRVAR(max_doc, +-"max(iterable[, key=func]) -> value\n\ +-max(a, b, c, ...[, key=func]) -> value\n\ ++"max(iterable, *[, default=obj, key=func]) -> value\n\ ++max(arg1, arg2, *args, *[, key=func]) -> value\n\ + \n\ +-With a single iterable argument, return its largest item.\n\ ++With a single iterable argument, return its biggest item. The\n\ ++default keyword-only argument specifies an object to return if\n\ ++the provided iterable is empty.\n\ + With two or more arguments, return the largest argument."); + + +diff -r c0e311e010fc Python/ceval.c +--- a/Python/ceval.c ++++ b/Python/ceval.c +@@ -1267,6 +1267,13 @@ + /* Other threads may run now */ + + take_gil(tstate); ++ ++ /* Check if we should make a quick exit. 
*/ ++ if (_Py_Finalizing && _Py_Finalizing != tstate) { ++ drop_gil(tstate); ++ PyThread_exit_thread(); ++ } ++ + if (PyThreadState_Swap(tstate) != NULL) + Py_FatalError("ceval: orphan tstate"); + } +diff -r c0e311e010fc Python/compile.c +--- a/Python/compile.c ++++ b/Python/compile.c +@@ -3856,12 +3856,16 @@ + target_depth = depth; + if (instr->i_opcode == FOR_ITER) { + target_depth = depth-2; +- } else if (instr->i_opcode == SETUP_FINALLY || +- instr->i_opcode == SETUP_EXCEPT) { ++ } ++ else if (instr->i_opcode == SETUP_FINALLY || ++ instr->i_opcode == SETUP_EXCEPT) { + target_depth = depth+3; + if (target_depth > maxdepth) + maxdepth = target_depth; + } ++ else if (instr->i_opcode == JUMP_IF_TRUE_OR_POP || ++ instr->i_opcode == JUMP_IF_FALSE_OR_POP) ++ depth = depth - 1; + maxdepth = stackdepth_walk(c, instr->i_target, + target_depth, maxdepth); + if (instr->i_opcode == JUMP_ABSOLUTE || +diff -r c0e311e010fc Python/import.c +--- a/Python/import.c ++++ b/Python/import.c +@@ -460,7 +460,7 @@ + while (PyDict_Next(modules, &pos, &key, &value)) { + if (PyModule_Check(value)) { + if (Py_VerboseFlag && PyUnicode_Check(key)) +- PySys_FormatStderr("# cleanup[2] removing %U\n", key, value); ++ PySys_FormatStderr("# cleanup[2] removing %U\n", key); + STORE_MODULE_WEAKREF(key, value); + PyDict_SetItem(modules, key, Py_None); + } +@@ -904,6 +904,7 @@ + &PyId__fix_up_module, + d, name, pathname, cpathname, NULL); + if (res != NULL) { ++ Py_DECREF(res); + res = exec_code_in_module(name, d, co); + } + return res; +diff -r c0e311e010fc Python/sysmodule.c +--- a/Python/sysmodule.c ++++ b/Python/sysmodule.c +@@ -1546,7 +1546,7 @@ + #define STRIFY(name) QUOTE(name) + #define MAJOR STRIFY(PY_MAJOR_VERSION) + #define MINOR STRIFY(PY_MINOR_VERSION) +-#define TAG NAME "-" MAJOR MINOR; ++#define TAG NAME "-" MAJOR MINOR + const char *_PySys_ImplCacheTag = TAG; + #undef NAME + #undef QUOTE +diff -r c0e311e010fc Tools/buildbot/external-amd64.bat +--- a/Tools/buildbot/external-amd64.bat ++++ b/Tools/buildbot/external-amd64.bat +@@ -6,16 +6,23 @@ + + if not exist tcltk64\bin\tcl86tg.dll ( + cd tcl-8.6.1.0\win +- nmake -f makefile.vc DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean all +- nmake -f makefile.vc DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 install ++ nmake -f makefile.vc OPTS=symbols MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 clean core shell dlls ++ nmake -f makefile.vc OPTS=symbols MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 install-binaries install-libraries + cd ..\.. + ) + + if not exist tcltk64\bin\tk86tg.dll ( +- cd tk-8.6.1.0\win +- nmake -f makefile.vc OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.6.1.0 clean +- nmake -f makefile.vc OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.6.1.0 all +- nmake -f makefile.vc OPTS=noxp DEBUG=1 MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.6.1.0 install ++ cd tk-8.6.1.0\win ++ nmake -f makefile.vc OPTS=symbols MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.6.1.0 clean ++ nmake -f makefile.vc OPTS=symbols MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.6.1.0 all ++ nmake -f makefile.vc OPTS=symbols MACHINE=AMD64 INSTALLDIR=..\..\tcltk64 TCLDIR=..\..\tcl-8.6.1.0 install-binaries install-libraries + cd ..\.. 
+ ) + ++if not exist tcltk64\lib\tix8.4.3\tix84g.dll ( ++ cd tix-8.4.3.4\win ++ nmake -f python.mak DEBUG=1 MACHINE=AMD64 TCL_DIR=..\..\tcl-8.6.1.0 TK_DIR=..\..\tk-8.6.1.0 INSTALL_DIR=..\..\tcltk64 clean ++ nmake -f python.mak DEBUG=1 MACHINE=AMD64 TCL_DIR=..\..\tcl-8.6.1.0 TK_DIR=..\..\tk-8.6.1.0 INSTALL_DIR=..\..\tcltk64 all ++ nmake -f python.mak DEBUG=1 MACHINE=AMD64 TCL_DIR=..\..\tcl-8.6.1.0 TK_DIR=..\..\tk-8.6.1.0 INSTALL_DIR=..\..\tcltk64 install ++ cd ..\.. ++) +diff -r c0e311e010fc Tools/buildbot/external-common.bat +--- a/Tools/buildbot/external-common.bat ++++ b/Tools/buildbot/external-common.bat +@@ -7,15 +7,12 @@ + @rem if exist bzip2-1.0.6 rd /s/q bzip2-1.0.6 + @rem if exist tcltk rd /s/q tcltk + @rem if exist tcltk64 rd /s/q tcltk64 +-@rem if exist tcl8.4.12 rd /s/q tcl8.4.12 +-@rem if exist tcl8.4.16 rd /s/q tcl8.4.16 +-@rem if exist tcl-8.4.18.1 rd /s/q tcl-8.4.18.1 +-@rem if exist tk8.4.12 rd /s/q tk8.4.12 +-@rem if exist tk8.4.16 rd /s/q tk8.4.16 +-@rem if exist tk-8.4.18.1 rd /s/q tk-8.4.18.1 ++@rem if exist tcl-8.6.1.0 rd /s/q tcl-8.6.1.0 ++@rem if exist tk-8.6.1.0 rd /s/q tk-8.6.1.0 ++@rem if exist tix-8.4.3.4 rd /s/q tix-8.4.3.4 + @rem if exist db-4.4.20 rd /s/q db-4.4.20 +-@rem if exist openssl-1.0.1e rd /s/q openssl-1.0.1g +-@rem if exist sqlite-3.7.12 rd /s/q sqlite-3.7.12 ++@rem if exist openssl-1.0.1h rd /s/q openssl-1.0.1h ++@rem if exist sqlite-3.7.12 rd /s/q sqlite-3.7.12 + + @rem bzip + if not exist bzip2-1.0.6 ( +@@ -24,9 +21,9 @@ + ) + + @rem OpenSSL +-if not exist openssl-1.0.1g ( +- rd /s/q openssl-1.0.1e +- svn export http://svn.python.org/projects/external/openssl-1.0.1g ++if not exist openssl-1.0.1h ( ++ rd /s/q openssl-1.0.1g ++ svn export http://svn.python.org/projects/external/openssl-1.0.1h + ) + + @rem tcl/tk +@@ -35,6 +32,7 @@ + svn export http://svn.python.org/projects/external/tcl-8.6.1.0 + ) + if not exist tk-8.6.1.0 svn export http://svn.python.org/projects/external/tk-8.6.1.0 ++if not exist tix-8.4.3.4 svn export http://svn.python.org/projects/external/tix-8.4.3.4 + + @rem sqlite3 + if not exist sqlite-3.8.3.1 ( +diff -r c0e311e010fc Tools/buildbot/external.bat +--- a/Tools/buildbot/external.bat ++++ b/Tools/buildbot/external.bat +@@ -7,15 +7,23 @@ + if not exist tcltk\bin\tcl86tg.dll ( + @rem all and install need to be separate invocations, otherwise nmakehlp is not found on install + cd tcl-8.6.1.0\win +- nmake -f makefile.vc DEBUG=1 INSTALLDIR=..\..\tcltk clean all +- nmake -f makefile.vc DEBUG=1 INSTALLDIR=..\..\tcltk install ++ nmake -f makefile.vc OPTS=symbols INSTALLDIR=..\..\tcltk clean core shell dlls ++ nmake -f makefile.vc OPTS=symbols INSTALLDIR=..\..\tcltk install-binaries install-libraries + cd ..\.. + ) + + if not exist tcltk\bin\tk86tg.dll ( + cd tk-8.6.1.0\win +- nmake -f makefile.vc OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.6.1.0 clean +- nmake -f makefile.vc OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.6.1.0 all +- nmake -f makefile.vc OPTS=noxp DEBUG=1 INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.6.1.0 install ++ nmake -f makefile.vc OPTS=symbols INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.6.1.0 clean ++ nmake -f makefile.vc OPTS=symbols INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.6.1.0 all ++ nmake -f makefile.vc OPTS=symbols INSTALLDIR=..\..\tcltk TCLDIR=..\..\tcl-8.6.1.0 install-binaries install-libraries + cd ..\.. 
+ ) ++ ++if not exist tcltk\lib\tix8.4.3\tix84g.dll ( ++ cd tix-8.4.3.4\win ++ nmake -f python.mak DEBUG=1 MACHINE=IX86 TCL_DIR=..\..\tcl-8.6.1.0 TK_DIR=..\..\tk-8.6.1.0 INSTALL_DIR=..\..\tcltk clean ++ nmake -f python.mak DEBUG=1 MACHINE=IX86 TCL_DIR=..\..\tcl-8.6.1.0 TK_DIR=..\..\tk-8.6.1.0 INSTALL_DIR=..\..\tcltk all ++ nmake -f python.mak DEBUG=1 MACHINE=IX86 TCL_DIR=..\..\tcl-8.6.1.0 TK_DIR=..\..\tk-8.6.1.0 INSTALL_DIR=..\..\tcltk install ++ cd ..\.. ++) +diff -r c0e311e010fc Tools/clinic/clinic.py +--- a/Tools/clinic/clinic.py ++++ b/Tools/clinic/clinic.py +@@ -2044,11 +2044,9 @@ + # automatically add converter for default format unit + # (but without stomping on the existing one if it's already + # set, in case you subclass) +- if ((cls.format_unit != 'O&') and ++ if ((cls.format_unit not in ('O&', '')) and + (cls.format_unit not in legacy_converters)): + legacy_converters[cls.format_unit] = cls +- if cls.format_unit: +- legacy_converters[cls.format_unit] = cls + return cls + + def add_legacy_c_converter(format_unit, **kwargs): +diff -r c0e311e010fc Tools/msi/msi.py +--- a/Tools/msi/msi.py ++++ b/Tools/msi/msi.py +@@ -1320,27 +1320,38 @@ + add_data(db, "RemoveFile", + [("MenuDir", "TARGETDIR", None, "MenuDir", 2)]) + tcltkshortcuts = [] ++ if msilib.Win64: ++ bitted = "64 bit" ++ else: ++ bitted = "32 bit" + if have_tcl: + tcltkshortcuts = [ +- ("IDLE", "MenuDir", "IDLE|IDLE (Python GUI)", "pythonw.exe", +- tcltk.id, r'"[TARGETDIR]Lib\idlelib\idle.pyw"', None, None, "python_icon.exe", 0, None, "TARGETDIR"), ++ ("IDLE", "MenuDir", ++ "IDLE|IDLE (Python "+short_version+" GUI - "+bitted+")", ++ "pythonw.exe", tcltk.id, r'"[TARGETDIR]Lib\idlelib\idle.pyw"', ++ None, None, "python_icon.exe", 0, None, "TARGETDIR"), + ] + add_data(db, "Shortcut", + tcltkshortcuts + + [# Advertised shortcuts: targets are features, not files +- ("Python", "MenuDir", "PYTHON|Python (command line)", "python.exe", +- default_feature.id, None, None, None, "python_icon.exe", 2, None, "TARGETDIR"), ++ ("Python", "MenuDir", ++ "PYTHON|Python "+short_version+" (command line - "+bitted+")", ++ "python.exe", default_feature.id, None, None, None, ++ "python_icon.exe", 2, None, "TARGETDIR"), + # Advertising the Manual breaks on (some?) Win98, and the shortcut lacks an + # icon first. 
+ #("Manual", "MenuDir", "MANUAL|Python Manuals", "documentation", + # htmlfiles.id, None, None, None, None, None, None, None), + ## Non-advertised shortcuts: must be associated with a registry component +- ("Manual", "MenuDir", "MANUAL|Python Manuals", "REGISTRY.doc", +- "[#%s]" % docfile, None, +- None, None, None, None, None, None), +- ("PyDoc", "MenuDir", "MODDOCS|Module Docs", "python.exe", +- default_feature.id, r'-m pydoc -b', None, None, "python_icon.exe", 0, None, "TARGETDIR"), +- ("Uninstall", "MenuDir", "UNINST|Uninstall Python", "REGISTRY", ++ ("Manual", "MenuDir", "MANUAL|Python "+short_version+" Manuals", ++ "REGISTRY.doc", "[#%s]" % docfile, ++ None, None, None, None, None, None, None), ++ ("PyDoc", "MenuDir", ++ "MODDOCS|Python "+short_version+" Docs Server (pydoc - "+ ++ bitted+")", "python.exe", default_feature.id, r'-m pydoc -b', ++ None, None, "python_icon.exe", 0, None, "TARGETDIR"), ++ ("Uninstall", "MenuDir", "UNINST|Uninstall Python "+ ++ short_version+" ("+bitted+")", "REGISTRY", + SystemFolderName+"msiexec", "/x%s" % product_code, + None, None, None, None, None, None), + ]) +diff -r c0e311e010fc Tools/parser/test_unparse.py +--- a/Tools/parser/test_unparse.py ++++ /dev/null +@@ -1,276 +0,0 @@ +-import unittest +-import test.support +-import io +-import os +-import random +-import tokenize +-import unparse +-import ast +- +-def read_pyfile(filename): +- """Read and return the contents of a Python source file (as a +- string), taking into account the file encoding.""" +- with open(filename, "rb") as pyfile: +- encoding = tokenize.detect_encoding(pyfile.readline)[0] +- with open(filename, "r", encoding=encoding) as pyfile: +- source = pyfile.read() +- return source +- +-for_else = """\ +-def f(): +- for x in range(10): +- break +- else: +- y = 2 +- z = 3 +-""" +- +-while_else = """\ +-def g(): +- while True: +- break +- else: +- y = 2 +- z = 3 +-""" +- +-relative_import = """\ +-from . import fred +-from .. import barney +-from .australia import shrimp as prawns +-""" +- +-nonlocal_ex = """\ +-def f(): +- x = 1 +- def g(): +- nonlocal x +- x = 2 +- y = 7 +- def h(): +- nonlocal x, y +-""" +- +-# also acts as test for 'except ... as ...' 
+-raise_from = """\ +-try: +- 1 / 0 +-except ZeroDivisionError as e: +- raise ArithmeticError from e +-""" +- +-class_decorator = """\ +-@f1(arg) +-@f2 +-class Foo: pass +-""" +- +-elif1 = """\ +-if cond1: +- suite1 +-elif cond2: +- suite2 +-else: +- suite3 +-""" +- +-elif2 = """\ +-if cond1: +- suite1 +-elif cond2: +- suite2 +-""" +- +-try_except_finally = """\ +-try: +- suite1 +-except ex1: +- suite2 +-except ex2: +- suite3 +-else: +- suite4 +-finally: +- suite5 +-""" +- +-with_simple = """\ +-with f(): +- suite1 +-""" +- +-with_as = """\ +-with f() as x: +- suite1 +-""" +- +-with_two_items = """\ +-with f() as x, g() as y: +- suite1 +-""" +- +-class ASTTestCase(unittest.TestCase): +- def assertASTEqual(self, ast1, ast2): +- self.assertEqual(ast.dump(ast1), ast.dump(ast2)) +- +- def check_roundtrip(self, code1, filename="internal"): +- ast1 = compile(code1, filename, "exec", ast.PyCF_ONLY_AST) +- unparse_buffer = io.StringIO() +- unparse.Unparser(ast1, unparse_buffer) +- code2 = unparse_buffer.getvalue() +- ast2 = compile(code2, filename, "exec", ast.PyCF_ONLY_AST) +- self.assertASTEqual(ast1, ast2) +- +-class UnparseTestCase(ASTTestCase): +- # Tests for specific bugs found in earlier versions of unparse +- +- def test_del_statement(self): +- self.check_roundtrip("del x, y, z") +- +- def test_shifts(self): +- self.check_roundtrip("45 << 2") +- self.check_roundtrip("13 >> 7") +- +- def test_for_else(self): +- self.check_roundtrip(for_else) +- +- def test_while_else(self): +- self.check_roundtrip(while_else) +- +- def test_unary_parens(self): +- self.check_roundtrip("(-1)**7") +- self.check_roundtrip("(-1.)**8") +- self.check_roundtrip("(-1j)**6") +- self.check_roundtrip("not True or False") +- self.check_roundtrip("True or not False") +- +- def test_integer_parens(self): +- self.check_roundtrip("3 .__abs__()") +- +- def test_huge_float(self): +- self.check_roundtrip("1e1000") +- self.check_roundtrip("-1e1000") +- self.check_roundtrip("1e1000j") +- self.check_roundtrip("-1e1000j") +- +- def test_min_int(self): +- self.check_roundtrip(str(-2**31)) +- self.check_roundtrip(str(-2**63)) +- +- def test_imaginary_literals(self): +- self.check_roundtrip("7j") +- self.check_roundtrip("-7j") +- self.check_roundtrip("0j") +- self.check_roundtrip("-0j") +- +- def test_lambda_parentheses(self): +- self.check_roundtrip("(lambda: int)()") +- +- def test_chained_comparisons(self): +- self.check_roundtrip("1 < 4 <= 5") +- self.check_roundtrip("a is b is c is not d") +- +- def test_function_arguments(self): +- self.check_roundtrip("def f(): pass") +- self.check_roundtrip("def f(a): pass") +- self.check_roundtrip("def f(b = 2): pass") +- self.check_roundtrip("def f(a, b): pass") +- self.check_roundtrip("def f(a, b = 2): pass") +- self.check_roundtrip("def f(a = 5, b = 2): pass") +- self.check_roundtrip("def f(*, a = 1, b = 2): pass") +- self.check_roundtrip("def f(*, a = 1, b): pass") +- self.check_roundtrip("def f(*, a, b = 2): pass") +- self.check_roundtrip("def f(a, b = None, *, c, **kwds): pass") +- self.check_roundtrip("def f(a=2, *args, c=5, d, **kwds): pass") +- self.check_roundtrip("def f(*args, **kwargs): pass") +- +- def test_relative_import(self): +- self.check_roundtrip(relative_import) +- +- def test_nonlocal(self): +- self.check_roundtrip(nonlocal_ex) +- +- def test_raise_from(self): +- self.check_roundtrip(raise_from) +- +- def test_bytes(self): +- self.check_roundtrip("b'123'") +- +- def test_annotations(self): +- self.check_roundtrip("def f(a : int): pass") +- self.check_roundtrip("def f(a: 
int = 5): pass") +- self.check_roundtrip("def f(*args: [int]): pass") +- self.check_roundtrip("def f(**kwargs: dict): pass") +- self.check_roundtrip("def f() -> None: pass") +- +- def test_set_literal(self): +- self.check_roundtrip("{'a', 'b', 'c'}") +- +- def test_set_comprehension(self): +- self.check_roundtrip("{x for x in range(5)}") +- +- def test_dict_comprehension(self): +- self.check_roundtrip("{x: x*x for x in range(10)}") +- +- def test_class_decorators(self): +- self.check_roundtrip(class_decorator) +- +- def test_class_definition(self): +- self.check_roundtrip("class A(metaclass=type, *[], **{}): pass") +- +- def test_elifs(self): +- self.check_roundtrip(elif1) +- self.check_roundtrip(elif2) +- +- def test_try_except_finally(self): +- self.check_roundtrip(try_except_finally) +- +- def test_starred_assignment(self): +- self.check_roundtrip("a, *b, c = seq") +- self.check_roundtrip("a, (*b, c) = seq") +- self.check_roundtrip("a, *b[0], c = seq") +- self.check_roundtrip("a, *(b, c) = seq") +- +- def test_with_simple(self): +- self.check_roundtrip(with_simple) +- +- def test_with_as(self): +- self.check_roundtrip(with_as) +- +- def test_with_two_items(self): +- self.check_roundtrip(with_two_items) +- +- +-class DirectoryTestCase(ASTTestCase): +- """Test roundtrip behaviour on all files in Lib and Lib/test.""" +- +- # test directories, relative to the root of the distribution +- test_directories = 'Lib', os.path.join('Lib', 'test') +- +- def test_files(self): +- # get names of files to test +- dist_dir = os.path.join(os.path.dirname(__file__), os.pardir, os.pardir) +- +- names = [] +- for d in self.test_directories: +- test_dir = os.path.join(dist_dir, d) +- for n in os.listdir(test_dir): +- if n.endswith('.py') and not n.startswith('bad'): +- names.append(os.path.join(test_dir, n)) +- +- # Test limited subset of files unless the 'cpu' resource is specified. +- if not test.support.is_resource_enabled("cpu"): +- names = random.sample(names, 10) +- +- for filename in names: +- if test.support.verbose: +- print('Testing %s' % filename) +- source = read_pyfile(filename) +- self.check_roundtrip(source) +- +- +-def test_main(): +- test.support.run_unittest(UnparseTestCase, DirectoryTestCase) +- +-if __name__ == '__main__': +- test_main() +diff -r c0e311e010fc Tools/scripts/md5sum.py +--- a/Tools/scripts/md5sum.py ++++ b/Tools/scripts/md5sum.py +@@ -9,7 +9,7 @@ + rmode = 'rb' + + usage = """ +-usage: sum5 [-b] [-t] [-l] [-s bufsize] [file ...] ++usage: md5sum.py [-b] [-t] [-l] [-s bufsize] [file ...] + -b : read files in binary mode (default) + -t : read files in text mode (you almost certainly don't want this!) + -l : print last pathname component only +@@ -17,6 +17,7 @@ + file ... 
: files to sum; '-' or no files means stdin + """ % bufsize + ++import io + import sys + import os + import getopt +@@ -24,7 +25,7 @@ + + def sum(*files): + sts = 0 +- if files and isinstance(files[-1], file): ++ if files and isinstance(files[-1], io.IOBase): + out, files = files[-1], files[:-1] + else: + out = sys.stdout +@@ -53,12 +54,14 @@ + return sts + + def printsumfp(fp, filename, out=sys.stdout): +- m = md5.new() ++ m = md5() + try: + while 1: + data = fp.read(bufsize) + if not data: + break ++ if isinstance(data, str): ++ data = data.encode(fp.encoding) + m.update(data) + except IOError as msg: + sys.stderr.write('%s: I/O error: %s\n' % (filename, msg)) +diff -r c0e311e010fc Tools/scripts/pydocgui.pyw +--- a/Tools/scripts/pydocgui.pyw ++++ /dev/null +@@ -1,7 +0,0 @@ +-# Note: this file must not be named pydoc.pyw, lest it just end up +-# importing itself (Python began allowing import of .pyw files +-# between 2.2a1 and 2.2a2). +-import pydoc +- +-if __name__ == '__main__': +- pydoc.gui() +diff -r c0e311e010fc configure.ac +--- a/configure.ac ++++ b/configure.ac +@@ -1318,10 +1318,16 @@ + # 4. If we are running on OS X 10.2 or earlier, good luck! + + AC_MSG_CHECKING(which MACOSX_DEPLOYMENT_TARGET to use) +- cur_target=`sw_vers -productVersion | sed 's/\(10\.[[0-9]]*\).*/\1/'` +- if test ${cur_target} '>' 10.2 && \ +- test ${cur_target} '<' 10.6 ++ cur_target_major=`sw_vers -productVersion | \ ++ sed 's/\([[0-9]]*\)\.\([[0-9]]*\).*/\1/'` ++ cur_target_minor=`sw_vers -productVersion | \ ++ sed 's/\([[0-9]]*\)\.\([[0-9]]*\).*/\2/'` ++ cur_target="${cur_target_major}.${cur_target_minor}" ++ if test ${cur_target_major} -eq 10 && \ ++ test ${cur_target_minor} -ge 3 && \ ++ test ${cur_target_minor} -le 5 + then ++ # OS X 10.3 through 10.5 + cur_target=10.3 + if test ${enable_universalsdk} + then +@@ -2017,12 +2023,14 @@ + # Use -undefined dynamic_lookup whenever possible (10.3 and later). + # This allows an extension to be used in any Python + +- if test ${MACOSX_DEPLOYMENT_TARGET} '>' 10.2 ++ dep_target_major=`echo ${MACOSX_DEPLOYMENT_TARGET} | \ ++ sed 's/\([[0-9]]*\)\.\([[0-9]]*\).*/\1/'` ++ dep_target_minor=`echo ${MACOSX_DEPLOYMENT_TARGET} | \ ++ sed 's/\([[0-9]]*\)\.\([[0-9]]*\).*/\2/'` ++ if test ${dep_target_major} -eq 10 && \ ++ test ${dep_target_minor} -le 2 + then +- LDSHARED='$(CC) -bundle -undefined dynamic_lookup' +- LDCXXSHARED='$(CXX) -bundle -undefined dynamic_lookup' +- BLDSHARED="$LDSHARED" +- else ++ # building for OS X 10.0 through 10.2 + LDSHARED='$(CC) -bundle' + LDCXXSHARED='$(CXX) -bundle' + if test "$enable_framework" ; then +@@ -2036,6 +2044,11 @@ + LDSHARED="$LDSHARED "'-bundle_loader $(BINDIR)/python$(VERSION)$(EXE)' + LDCXXSHARED="$LDCXXSHARED "'-bundle_loader $(BINDIR)/python$(VERSION)$(EXE)' + fi ++ else ++ # building for OS X 10.3 and later ++ LDSHARED='$(CC) -bundle -undefined dynamic_lookup' ++ LDCXXSHARED='$(CXX) -bundle -undefined dynamic_lookup' ++ BLDSHARED="$LDSHARED" + fi + ;; + Linux*|GNU*|QNX*) +diff -r c0e311e010fc setup.py +--- a/setup.py ++++ b/setup.py +@@ -697,7 +697,9 @@ + if host_platform == 'darwin': + os_release = int(os.uname()[2].split('.')[0]) + dep_target = sysconfig.get_config_var('MACOSX_DEPLOYMENT_TARGET') +- if dep_target and dep_target.split('.') < ['10', '5']: ++ if (dep_target and ++ (tuple(int(n) for n in dep_target.split('.')[0:2]) ++ < (10, 5) ) ): + os_release = 8 + if os_release < 9: + # MacOSX 10.4 has a broken readline. 
Don't try to build --- python3.4-3.4.1.orig/debian/patches/hurd-disable-nonworking-constants.diff +++ python3.4-3.4.1/debian/patches/hurd-disable-nonworking-constants.diff @@ -0,0 +1,38 @@ +# DP: Comment out constant exposed on the API which are not implemented on +# DP: GNU/Hurd. They would not work at runtime anyway. + +Index: b/Modules/socketmodule.c +=================================================================== +--- a/Modules/socketmodule.c ++++ b/Modules/socketmodule.c +@@ -6129,9 +6129,11 @@ + #ifdef SO_OOBINLINE + PyModule_AddIntMacro(m, SO_OOBINLINE); + #endif ++#ifndef __GNU__ + #ifdef SO_REUSEPORT + PyModule_AddIntMacro(m, SO_REUSEPORT); + #endif ++#endif + #ifdef SO_SNDBUF + PyModule_AddIntMacro(m, SO_SNDBUF); + #endif +Index: b/Modules/posixmodule.c +=================================================================== +--- a/Modules/posixmodule.c ++++ b/Modules/posixmodule.c +@@ -11724,12 +11724,14 @@ + #ifdef O_LARGEFILE + if (PyModule_AddIntMacro(m, O_LARGEFILE)) return -1; + #endif ++#ifndef __GNU__ + #ifdef O_SHLOCK + if (PyModule_AddIntMacro(m, O_SHLOCK)) return -1; + #endif + #ifdef O_EXLOCK + if (PyModule_AddIntMacro(m, O_EXLOCK)) return -1; + #endif ++#endif + #ifdef O_EXEC + if (PyModule_AddIntMacro(m, O_EXEC)) return -1; + #endif --- python3.4-3.4.1.orig/debian/patches/issue21264.diff +++ python3.4-3.4.1/debian/patches/issue21264.diff @@ -0,0 +1,28 @@ +# DP: Fix issue #21264, test_compileall test failures in the installed location + +--- a/Lib/test/test_compileall.py ++++ b/Lib/test/test_compileall.py +@@ -187,19 +187,19 @@ + os.utime(pycpath, (time.time()-60,)*2) + mtime = os.stat(pycpath).st_mtime + # Without force, no recompilation +- self.assertRunOK(PYTHONPATH=self.directory) ++ self.assertRunOK(self.directory) + mtime2 = os.stat(pycpath).st_mtime + self.assertEqual(mtime, mtime2) + # Now force it. 
+- self.assertRunOK('-f', PYTHONPATH=self.directory) ++ self.assertRunOK('-f', self.directory) + mtime2 = os.stat(pycpath).st_mtime + self.assertNotEqual(mtime, mtime2) + + def test_no_args_respects_quiet_flag(self): + script_helper.make_script(self.directory, 'baz', '') +- noisy = self.assertRunOK(PYTHONPATH=self.directory) ++ noisy = self.assertRunOK(self.directory) + self.assertIn(b'Listing ', noisy) +- quiet = self.assertRunOK('-q', PYTHONPATH=self.directory) ++ quiet = self.assertRunOK('-q', self.directory) + self.assertNotIn(b'Listing ', quiet) + + # Ensure that the default behavior of compileall's CLI is to create --- python3.4-3.4.1.orig/debian/patches/langpack-gettext.diff +++ python3.4-3.4.1/debian/patches/langpack-gettext.diff @@ -0,0 +1,36 @@ +# DP: Description: support alternative gettext tree in +# DP: /usr/share/locale-langpack; if a file is present in both trees, +# DP: prefer the newer one +# DP: Upstream status: Ubuntu-Specific + +Index: b/Lib/gettext.py +=================================================================== +--- a/Lib/gettext.py ++++ b/Lib/gettext.py +@@ -378,11 +378,26 @@ + if lang == 'C': + break + mofile = os.path.join(localedir, lang, 'LC_MESSAGES', '%s.mo' % domain) ++ mofile_lp = os.path.join("/usr/share/locale-langpack", lang, ++ 'LC_MESSAGES', '%s.mo' % domain) ++ ++ # first look into the standard locale dir, then into the ++ # langpack locale dir ++ ++ # standard mo file + if os.path.exists(mofile): + if all: + result.append(mofile) + else: + return mofile ++ ++ # langpack mofile -> use it ++ if os.path.exists(mofile_lp): ++ if all: ++ result.append(mofile_lp) ++ else: ++ return mofile_lp ++ + return result + + --- python3.4-3.4.1.orig/debian/patches/lib-argparse.diff +++ python3.4-3.4.1/debian/patches/lib-argparse.diff @@ -0,0 +1,22 @@ +# DP: argparse.py: Make the gettext import conditional + +--- a/Lib/argparse.py ++++ b/Lib/argparse.py +@@ -90,7 +90,16 @@ + import sys as _sys + import textwrap as _textwrap + +-from gettext import gettext as _, ngettext ++try: ++ from gettext import gettext as _, ngettext ++except ImportError: ++ def _(message): ++ return message ++ def ngettext(singular,plural,n): ++ if n == 1: ++ return singular ++ else: ++ return plural + + + SUPPRESS = '==SUPPRESS==' --- python3.4-3.4.1.orig/debian/patches/lib2to3-no-pickled-grammar.diff +++ python3.4-3.4.1/debian/patches/lib2to3-no-pickled-grammar.diff @@ -0,0 +1,14 @@ +--- a/Lib/lib2to3/pgen2/driver.py ++++ b/Lib/lib2to3/pgen2/driver.py +@@ -119,7 +119,10 @@ + if force or not _newer(gp, gt): + logger.info("Generating grammar tables from %s", gt) + g = pgen.generate_grammar(gt) +- if save: ++ # the pickle files mismatch, when built on different architectures. ++ # don't save these for now. 
An alternative solution might be to ++ # include the multiarch triplet into the file name ++ if False: + logger.info("Writing grammar tables to %s", gp) + try: + g.dump(gp) --- python3.4-3.4.1.orig/debian/patches/libffi-shared.diff +++ python3.4-3.4.1/debian/patches/libffi-shared.diff @@ -0,0 +1,13 @@ +Index: b/setup.py +=================================================================== +--- a/setup.py ++++ b/setup.py +@@ -1946,7 +1946,7 @@ class PyBuildExt(build_ext): + break + ffi_lib = None + if ffi_inc is not None: +- for lib_name in ('ffi_convenience', 'ffi_pic', 'ffi'): ++ for lib_name in ('ffi', 'ffi_convenience', 'ffi_pic', 'ffi'): + if (self.compiler.find_library_file(lib_dirs, lib_name)): + ffi_lib = lib_name + break --- python3.4-3.4.1.orig/debian/patches/link-opt.diff +++ python3.4-3.4.1/debian/patches/link-opt.diff @@ -0,0 +1,26 @@ +# DP: Call the linker with -O1 -Bsymbolic-functions + +Index: b/configure.ac +=================================================================== +--- a/configure.ac ++++ b/configure.ac +@@ -2039,8 +2039,8 @@ then + fi + ;; + Linux*|GNU*|QNX*) +- LDSHARED='$(CC) -shared' +- LDCXXSHARED='$(CXX) -shared';; ++ LDSHARED='$(CC) -shared -Wl,-O1 -Wl,-Bsymbolic-functions' ++ LDCXXSHARED='$(CXX) -shared -Wl,-O1 -Wl,-Bsymbolic-functions';; + BSD/OS*/4*) + LDSHARED="gcc -shared" + LDCXXSHARED="g++ -shared";; +@@ -2138,7 +2138,7 @@ then + LINKFORSHARED="-Wl,-E -Wl,+s";; + # LINKFORSHARED="-Wl,-E -Wl,+s -Wl,+b\$(BINLIBDEST)/lib-dynload";; + BSD/OS/4*) LINKFORSHARED="-Xlinker -export-dynamic";; +- Linux*|GNU*) LINKFORSHARED="-Xlinker -export-dynamic";; ++ Linux*|GNU*) LINKFORSHARED="-Xlinker -export-dynamic -Wl,-O1 -Wl,-Bsymbolic-functions";; + # -u libsys_s pulls in all symbols in libsys + Darwin/*) + LINKFORSHARED="$extra_undefs -framework CoreFoundation" --- python3.4-3.4.1.orig/debian/patches/link-timemodule.diff +++ python3.4-3.4.1/debian/patches/link-timemodule.diff @@ -0,0 +1,13 @@ +Index: b/Modules/Setup.dist +=================================================================== +--- a/Modules/Setup.dist ++++ b/Modules/Setup.dist +@@ -171,7 +171,7 @@ + #cmath cmathmodule.c _math.c # -lm # complex math library functions + #math mathmodule.c _math.c # -lm # math library functions, e.g. sin() + #_struct _struct.c # binary structure packing/unpacking +-#time timemodule.c # -lm # time operations and variables ++#time timemodule.c -lrt # -lm # time operations and variables + #_weakref _weakref.c # basic weak reference support + #_testcapi _testcapimodule.c # Python C API test module + #_random _randommodule.c # Random number generator --- python3.4-3.4.1.orig/debian/patches/locale-module.diff +++ python3.4-3.4.1/debian/patches/locale-module.diff @@ -0,0 +1,19 @@ +# DP: * Lib/locale.py: +# DP: - Don't map 'utf8', 'utf-8' to 'utf', which is not a known encoding +# DP: for glibc. 
+ +Index: b/Lib/locale.py +=================================================================== +--- a/Lib/locale.py ++++ b/Lib/locale.py +@@ -1222,8 +1222,8 @@ + 'turkish': 'tr_TR.ISO8859-9', + 'uk': 'uk_UA.KOI8-U', + 'uk_ua': 'uk_UA.KOI8-U', +- 'univ': 'en_US.utf', +- 'universal': 'en_US.utf', ++ 'univ': 'en_US.UTF-8', ++ 'universal': 'en_US.UTF-8', + 'universal.utf8@ucs4': 'en_US.UTF-8', + 'ur': 'ur_PK.CP1256', + 'ur_in': 'ur_IN.UTF-8', --- python3.4-3.4.1.orig/debian/patches/lto-link-flags.diff +++ python3.4-3.4.1/debian/patches/lto-link-flags.diff @@ -0,0 +1,22 @@ +Index: b/Makefile.pre.in +=================================================================== +--- a/Makefile.pre.in ++++ b/Makefile.pre.in +@@ -128,7 +128,7 @@ CONFINCLUDEPY= $(CONFINCLUDEDIR)/python$ + SHLIB_SUFFIX= @SHLIB_SUFFIX@ + EXT_SUFFIX= @EXT_SUFFIX@ + LDSHARED= @LDSHARED@ $(PY_LDFLAGS) +-BLDSHARED= @BLDSHARED@ $(PY_LDFLAGS) ++BLDSHARED= @BLDSHARED@ $(PY_LDFLAGS) $(PY_CFLAGS) + LDCXXSHARED= @LDCXXSHARED@ + DESTSHARED= $(BINLIBDEST)/lib-dynload + +@@ -542,7 +542,7 @@ clinic: $(BUILDPYTHON) + + # Build the interpreter + $(BUILDPYTHON): Modules/python.o $(LIBRARY) $(LDLIBRARY) $(PY3LIBRARY) +- $(LINKCC) $(PY_LDFLAGS) $(LINKFORSHARED) -o $@ Modules/python.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS) $(LDLAST) ++ $(LINKCC) $(PY_LDFLAGS) $(PY_CFLAGS) $(LINKFORSHARED) -o $@ Modules/python.o $(BLDLIBRARY) $(LIBS) $(MODLIBS) $(SYSLIBS) $(LDLAST) + + platform: $(BUILDPYTHON) pybuilddir.txt + $(RUNSHARED) $(PYTHON_FOR_BUILD) -c 'import sys ; from sysconfig import get_platform ; print(get_platform()+"-"+sys.version[0:3])' >platform --- python3.4-3.4.1.orig/debian/patches/makesetup-bashism.diff +++ python3.4-3.4.1/debian/patches/makesetup-bashism.diff @@ -0,0 +1,15 @@ +# DP: Fix bashism in makesetup shell script + +Index: b/Modules/makesetup +=================================================================== +--- a/Modules/makesetup ++++ b/Modules/makesetup +@@ -277,7 +277,7 @@ sed -e 's/[ ]*#.*//' -e '/^[ ]*$/d' | + -) ;; + *) sedf="@sed.in.$$" + trap 'rm -f $sedf' 0 1 2 3 +- echo "1i\\" >$sedf ++ printf "1i\\" >$sedf + str="# Generated automatically from $makepre by makesetup." + echo "$str" >>$sedf + echo "s%_MODOBJS_%$OBJS%" >>$sedf --- python3.4-3.4.1.orig/debian/patches/multiarch-extname.diff +++ python3.4-3.4.1/debian/patches/multiarch-extname.diff @@ -0,0 +1,84 @@ +Index: b/Lib/distutils/dir_util.py +=================================================================== +--- a/Lib/distutils/dir_util.py ++++ b/Lib/distutils/dir_util.py +@@ -96,6 +96,9 @@ + for dir in sorted(need_dir): + mkpath(dir, mode, verbose=verbose, dry_run=dry_run) + ++import sysconfig ++_multiarch = None ++ + def copy_tree(src, dst, preserve_mode=1, preserve_times=1, + preserve_symlinks=0, update=0, verbose=1, dry_run=0): + """Copy an entire directory tree 'src' to a new location 'dst'. 
+@@ -132,6 +135,9 @@ + raise DistutilsFileError( + "error listing files in '%s': %s" % (src, errstr)) + ++ ext_suffix = sysconfig.get_config_var ('EXT_SUFFIX') ++ new_suffix = "%s-%s%s" % (ext_suffix[:-3], _multiarch, ext_suffix[-3:]) ++ + if not dry_run: + mkpath(dst, verbose=verbose) + +@@ -140,6 +146,9 @@ + for n in names: + src_name = os.path.join(src, n) + dst_name = os.path.join(dst, n) ++ if _multiarch and n.endswith(ext_suffix) and not n.endswith(new_suffix): ++ dst_name = os.path.join(dst, n.replace(ext_suffix, new_suffix)) ++ log.info("renaming extension %s -> %s", n, n.replace(ext_suffix, new_suffix)) + + if n.startswith('.nfs'): + # skip NFS rename files +Index: b/Lib/distutils/command/install_lib.py +=================================================================== +--- a/Lib/distutils/command/install_lib.py ++++ b/Lib/distutils/command/install_lib.py +@@ -56,6 +56,7 @@ + self.compile = None + self.optimize = None + self.skip_build = None ++ self.multiarch = None # if we should rename the extensions + + def finalize_options(self): + # Get all the information we need to install pure Python modules +@@ -68,6 +69,7 @@ + ('compile', 'compile'), + ('optimize', 'optimize'), + ('skip_build', 'skip_build'), ++ ('multiarch', 'multiarch'), + ) + + if self.compile is None: +@@ -108,6 +110,8 @@ + + def install(self): + if os.path.isdir(self.build_dir): ++ import distutils.dir_util ++ distutils.dir_util._multiarch = self.multiarch + outfiles = self.copy_tree(self.build_dir, self.install_dir) + else: + self.warn("'%s' does not exist -- no Python modules to install" % +Index: b/Lib/distutils/command/install.py +=================================================================== +--- a/Lib/distutils/command/install.py ++++ b/Lib/distutils/command/install.py +@@ -207,6 +207,7 @@ + + # enable custom installation, known values: deb + self.install_layout = None ++ self.multiarch = None + + self.compile = None + self.optimize = None +@@ -464,6 +465,8 @@ + self.install_platbase = self.exec_prefix + if self.install_layout: + if self.install_layout.lower() in ['deb']: ++ import sysconfig ++ self.multiarch = sysconfig.get_config_var('MULTIARCH') + self.select_scheme("deb_system") + elif self.install_layout.lower() in ['unix']: + self.select_scheme("unix_prefix") --- python3.4-3.4.1.orig/debian/patches/multiarch.diff +++ python3.4-3.4.1/debian/patches/multiarch.diff @@ -0,0 +1,162 @@ +Index: b/Lib/sysconfig.py +=================================================================== +--- a/Lib/sysconfig.py ++++ b/Lib/sysconfig.py +@@ -339,6 +339,8 @@ def get_makefile_filename(): + config_dir_name = 'config-%s%s' % (_PY_VERSION_SHORT, sys.abiflags) + else: + config_dir_name = 'config' ++ if hasattr(sys.implementation, '_multiarch'): ++ config_dir_name += '-%s' % sys.implementation._multiarch + return os.path.join(get_path('stdlib'), config_dir_name, 'Makefile') + + def _generate_posix_vars(): +@@ -545,6 +547,12 @@ def get_config_vars(*args): + # the init-function. 
+ _CONFIG_VARS['userbase'] = _getuserbase() + ++ multiarch = get_config_var('MULTIARCH') ++ if multiarch: ++ _CONFIG_VARS['multiarchsubdir'] = '/' + multiarch ++ else: ++ _CONFIG_VARS['multiarchsubdir'] = '' ++ + # Always convert srcdir to an absolute path + srcdir = _CONFIG_VARS.get('srcdir', _PROJECT_BASE) + if os.name == 'posix': +Index: b/Lib/distutils/sysconfig.py +=================================================================== +--- a/Lib/distutils/sysconfig.py ++++ b/Lib/distutils/sysconfig.py +@@ -111,6 +111,9 @@ def get_python_inc(plat_specific=0, pref + incdir = os.path.join(get_config_var('srcdir'), 'Include') + return os.path.normpath(incdir) + python_dir = 'python' + get_python_version() + build_flags ++ if not python_build and plat_specific: ++ import sysconfig ++ return sysconfig.get_path('platinclude') + return os.path.join(prefix, "include", python_dir) + elif os.name == "nt": + return os.path.join(prefix, "include") +@@ -275,6 +278,8 @@ def get_makefile_filename(): + return os.path.join(_sys_home or project_base, "Makefile") + lib_dir = get_python_lib(plat_specific=0, standard_lib=1) + config_file = 'config-{}{}'.format(get_python_version(), build_flags) ++ if hasattr(sys.implementation, '_multiarch'): ++ config_file += '-%s' % sys.implementation._multiarch + return os.path.join(lib_dir, config_file, 'Makefile') + + +Index: b/Makefile.pre.in +=================================================================== +--- a/Makefile.pre.in ++++ b/Makefile.pre.in +@@ -724,6 +724,7 @@ Modules/signalmodule.o: $(srcdir)/Module + + Python/dynload_shlib.o: $(srcdir)/Python/dynload_shlib.c Makefile + $(CC) -c $(PY_CORE_CFLAGS) \ ++ $(if $(MULTIARCH),-DMULTIARCH='"$(MULTIARCH)"') \ + -DSOABI='"$(SOABI)"' \ + -o $@ $(srcdir)/Python/dynload_shlib.c + +@@ -735,6 +736,7 @@ Python/dynload_hpux.o: $(srcdir)/Python/ + Python/sysmodule.o: $(srcdir)/Python/sysmodule.c Makefile + $(CC) -c $(PY_CORE_CFLAGS) \ + -DABIFLAGS='"$(ABIFLAGS)"' \ ++ -DMULTIARCH='"$(MULTIARCH)"' \ + -o $@ $(srcdir)/Python/sysmodule.c + + $(IO_OBJS): $(IO_H) +@@ -1116,7 +1118,7 @@ maninstall: altmaninstall + (cd $(DESTDIR)$(MANDIR)/man1; $(LN) -s python$(VERSION).1 python3.1) + + # Install the library +-PLATDIR= plat-$(MACHDEP) ++PLATDIR= plat-$(MULTIARCH) + EXTRAPLATDIR= @EXTRAPLATDIR@ + MACHDEPS= $(PLATDIR) $(EXTRAPLATDIR) + XMLLIBSUBDIRS= xml xml/dom xml/etree xml/parsers xml/sax +@@ -1257,6 +1259,10 @@ libinstall: build_all $(srcdir)/Lib/$(PL + $(srcdir)/Lib/$(PLATDIR): + mkdir $(srcdir)/Lib/$(PLATDIR) + cp $(srcdir)/Lib/plat-generic/regen $(srcdir)/Lib/$(PLATDIR)/regen ++ if [ -n "$(MULTIARCH)" ]; then \ ++ cp -p $(srcdir)/Lib/plat-linux/*.py $(srcdir)/Lib/$(PLATDIR)/.; \ ++ rm -f $(srcdir)/Lib/$(PLATDIR)/IN.py; \ ++ fi + export PATH; PATH="`pwd`:$$PATH"; \ + export PYTHONPATH; PYTHONPATH="`pwd`/Lib"; \ + export DYLD_FRAMEWORK_PATH; DYLD_FRAMEWORK_PATH="`pwd`"; \ +@@ -1304,10 +1310,10 @@ inclinstall: + + # Install the library and miscellaneous stuff needed for extending/embedding + # This goes into $(exec_prefix) +-LIBPL= $(LIBDEST)/config-$(LDVERSION) ++LIBPL= $(LIBDEST)/config-$(LDVERSION)-$(MULTIARCH) + + # pkgconfig directory +-LIBPC= $(LIBDIR)/pkgconfig ++LIBPC= $(LIBDIR)/$(MULTIARCH)/pkgconfig + + libainstall: all python-config + @for i in $(LIBDIR) $(LIBPL) $(LIBPC); \ +Index: b/Python/dynload_shlib.c +=================================================================== +--- a/Python/dynload_shlib.c ++++ b/Python/dynload_shlib.c +@@ -36,6 +36,9 @@ const char *_PyImport_DynLoadFiletab[] = + #ifdef __CYGWIN__ + 
".dll", + #else /* !__CYGWIN__ */ ++#ifdef MULTIARCH ++ "." SOABI "-" MULTIARCH ".so", ++#endif + "." SOABI ".so", + ".abi" PYTHON_ABI_STRING ".so", + ".so", +Index: b/Modules/Setup.dist +=================================================================== +--- a/Modules/Setup.dist ++++ b/Modules/Setup.dist +@@ -91,7 +91,7 @@ SITEPATH= + TESTPATH= + + # Path components for machine- or system-dependent modules and shared libraries +-MACHDEPPATH=:plat-$(MACHDEP) ++MACHDEPPATH=:plat-$(MULTIARCH) + EXTRAMACHDEPPATH= + + COREPYTHONPATH=$(DESTPATH)$(SITEPATH)$(TESTPATH)$(MACHDEPPATH)$(EXTRAMACHDEPPATH) +Index: b/Python/sysmodule.c +=================================================================== +--- a/Python/sysmodule.c ++++ b/Python/sysmodule.c +@@ -1595,6 +1595,15 @@ make_impl_info(PyObject *version_info) + if (res < 0) + goto error; + ++ /* For Debian multiarch support. */ ++ value = PyUnicode_FromString(MULTIARCH); ++ if (value == NULL) ++ goto error; ++ res = PyDict_SetItemString(impl_info, "_multiarch", value); ++ Py_DECREF(value); ++ if (res < 0) ++ goto error; ++ + /* dict ready */ + + ns = _PyNamespace_New(impl_info); +Index: b/configure.ac +=================================================================== +--- a/configure.ac ++++ b/configure.ac +@@ -4098,7 +4098,7 @@ AC_MSG_RESULT($LDVERSION) + + dnl define LIBPL after ABIFLAGS and LDVERSION is defined. + AC_SUBST(PY_ENABLE_SHARED) +-LIBPL="${prefix}/lib/python${VERSION}/config-${LDVERSION}" ++LIBPL="${prefix}/lib/python${VERSION}/config-${LDVERSION}-${MULTIARCH}" + AC_SUBST(LIBPL) + + # Check whether right shifting a negative integer extends the sign bit --- python3.4-3.4.1.orig/debian/patches/no-large-file-support.diff +++ python3.4-3.4.1/debian/patches/no-large-file-support.diff @@ -0,0 +1,14 @@ +# DP: disable large file support for GNU/Hurd + +--- a/configure.ac ++++ b/configure.ac +@@ -1402,6 +1402,9 @@ + use_lfs=no + fi + ++# Don't use largefile support anyway. ++use_lfs=no ++ + if test "$use_lfs" = "yes"; then + # Two defines needed to enable largefile support on various platforms + # These may affect some typedefs --- python3.4-3.4.1.orig/debian/patches/no-zip-on-sys.path.diff +++ python3.4-3.4.1/debian/patches/no-zip-on-sys.path.diff @@ -0,0 +1,124 @@ +# DP: Do not add /usr/lib/pythonXY.zip on sys.path. + +Index: b/Modules/getpath.c +=================================================================== +--- a/Modules/getpath.c ++++ b/Modules/getpath.c +@@ -470,7 +470,9 @@ calculate_path(void) + wchar_t *path = NULL; + wchar_t *prog = Py_GetProgramName(); + wchar_t argv0_path[MAXPATHLEN+1]; ++#ifdef WITH_ZIP_PATH + wchar_t zip_path[MAXPATHLEN+1]; ++#endif + int pfound, efound; /* 1 if found; -1 if found build directory */ + wchar_t *buf; + size_t bufsz; +@@ -675,6 +677,7 @@ calculate_path(void) + else + reduce(prefix); + ++#ifdef WITH_ZIP_PATH + wcsncpy(zip_path, prefix, MAXPATHLEN); + zip_path[MAXPATHLEN] = L'\0'; + if (pfound > 0) { /* Use the reduced prefix returned by Py_GetPrefix() */ +@@ -687,6 +690,7 @@ calculate_path(void) + bufsz = wcslen(zip_path); /* Replace "00" with version */ + zip_path[bufsz - 6] = VERSION[0]; + zip_path[bufsz - 5] = VERSION[2]; ++#endif + + efound = search_for_exec_prefix(argv0_path, home, + _exec_prefix, lib_python); +@@ -732,7 +736,9 @@ calculate_path(void) + defpath = delim + 1; + } + ++#ifdef WITH_ZIP_PATH + bufsz += wcslen(zip_path) + 1; ++#endif + bufsz += wcslen(exec_prefix) + 1; + /* When running from the build directory, add room for the Modules + * subdirectory too. 
+@@ -754,9 +760,11 @@ calculate_path(void) + else + buf[0] = '\0'; + ++#ifdef WITH_ZIP_PATH + /* Next is the default zip path */ + wcscat(buf, zip_path); + wcscat(buf, delimiter); ++#endif + + /* Next goes merge of compile-time $PYTHONPATH with + * dynamically located prefix. +Index: b/Lib/test/test_cmd_line_script.py +=================================================================== +--- a/Lib/test/test_cmd_line_script.py ++++ b/Lib/test/test_cmd_line_script.py +@@ -256,11 +256,6 @@ class CmdLineTest(unittest.TestCase): + script_dir, '', + importlib.machinery.SourcelessFileLoader) + +- def test_directory_error(self): +- with temp_dir() as script_dir: +- msg = "can't find '__main__' module in %r" % script_dir +- self._check_import_error(script_dir, msg) +- + def test_zipfile(self): + with temp_dir() as script_dir: + script_name = _make_test_script(script_dir, '__main__') +@@ -276,13 +271,6 @@ class CmdLineTest(unittest.TestCase): + self._check_script(zip_name, run_name, zip_name, zip_name, '', + zipimport.zipimporter) + +- def test_zipfile_error(self): +- with temp_dir() as script_dir: +- script_name = _make_test_script(script_dir, 'not_main') +- zip_name, run_name = make_zip_script(script_dir, 'test_zip', script_name) +- msg = "can't find '__main__' module in %r" % zip_name +- self._check_import_error(zip_name, msg) +- + def test_module_in_package(self): + with temp_dir() as script_dir: + pkg_dir = os.path.join(script_dir, 'test_pkg') +Index: b/Lib/test/test_zipimport_support.py +=================================================================== +--- a/Lib/test/test_zipimport_support.py ++++ b/Lib/test/test_zipimport_support.py +@@ -185,35 +185,6 @@ class ZipSupportTests(unittest.TestCase) + finally: + del sys.modules["test_zipped_doctest"] + +- def test_doctest_main_issue4197(self): +- test_src = textwrap.dedent("""\ +- class Test: +- ">>> 'line 2'" +- pass +- +- import doctest +- doctest.testmod() +- """) +- pattern = 'File "%s", line 2, in %s' +- with temp_dir() as d: +- script_name = make_script(d, 'script', test_src) +- rc, out, err = assert_python_ok(script_name) +- expected = pattern % (script_name, "__main__.Test") +- if verbose: +- print ("Expected line", expected) +- print ("Got stdout:") +- print (ascii(out)) +- self.assertIn(expected.encode('utf-8'), out) +- zip_name, run_name = make_zip_script(d, "test_zip", +- script_name, '__main__.py') +- rc, out, err = assert_python_ok(zip_name) +- expected = pattern % (run_name, "__main__.Test") +- if verbose: +- print ("Expected line", expected) +- print ("Got stdout:") +- print (ascii(out)) +- self.assertIn(expected.encode('utf-8'), out) +- + def test_pdb_issue4201(self): + test_src = textwrap.dedent("""\ + def f(): --- python3.4-3.4.1.orig/debian/patches/platform-lsbrelease.diff +++ python3.4-3.4.1/debian/patches/platform-lsbrelease.diff @@ -0,0 +1,77 @@ +# DP: Use /etc/lsb-release to identify the platform. 
+ +Index: b/Lib/platform.py +=================================================================== +--- a/Lib/platform.py ++++ b/Lib/platform.py +@@ -265,7 +265,7 @@ + _supported_dists = ( + 'SuSE', 'debian', 'fedora', 'redhat', 'centos', + 'mandrake', 'mandriva', 'rocks', 'slackware', 'yellowdog', 'gentoo', +- 'UnitedLinux', 'turbolinux', 'arch', 'mageia') ++ 'UnitedLinux', 'turbolinux', 'arch', 'mageia', 'Ubuntu') + + def _parse_release_file(firstline): + +@@ -294,6 +294,10 @@ + id = l[1] + return '', version, id + ++_distributor_id_file_re = re.compile("(?:DISTRIB_ID\s*=)\s*(.*)", re.I) ++_release_file_re = re.compile("(?:DISTRIB_RELEASE\s*=)\s*(.*)", re.I) ++_codename_file_re = re.compile("(?:DISTRIB_CODENAME\s*=)\s*(.*)", re.I) ++ + def linux_distribution(distname='', version='', id='', + + supported_dists=_supported_dists, +@@ -318,6 +322,25 @@ + args given as parameters. + + """ ++ # check for the Debian/Ubuntu /etc/lsb-release file first, needed so ++ # that the distribution doesn't get identified as Debian. ++ try: ++ with open("/etc/lsb-release", "r") as etclsbrel: ++ for line in etclsbrel: ++ m = _distributor_id_file_re.search(line) ++ if m: ++ _u_distname = m.group(1).strip() ++ m = _release_file_re.search(line) ++ if m: ++ _u_version = m.group(1).strip() ++ m = _codename_file_re.search(line) ++ if m: ++ _u_id = m.group(1).strip() ++ if _u_distname and _u_version: ++ return (_u_distname, _u_version, _u_id) ++ except (EnvironmentError, UnboundLocalError): ++ pass ++ + try: + etc = os.listdir(_UNIXCONFDIR) + except OSError: +Index: b/Lib/test/test_platform.py +=================================================================== +--- a/Lib/test/test_platform.py ++++ b/Lib/test/test_platform.py +@@ -297,20 +297,6 @@ + returncode = ret >> 8 + self.assertEqual(returncode, len(data)) + +- def test_linux_distribution_encoding(self): +- # Issue #17429 +- with tempfile.TemporaryDirectory() as tempdir: +- filename = os.path.join(tempdir, 'fedora-release') +- with open(filename, 'w', encoding='utf-8') as f: +- f.write('Fedora release 19 (Schr\xf6dinger\u2019s Cat)\n') +- +- with mock.patch('platform._UNIXCONFDIR', tempdir): +- distname, version, distid = platform.linux_distribution() +- +- self.assertEqual(distname, 'Fedora') +- self.assertEqual(version, '19') +- self.assertEqual(distid, 'Schr\xf6dinger\u2019s Cat') +- + def test_main(): + support.run_unittest( + PlatformTest --- python3.4-3.4.1.orig/debian/patches/profiled-build.diff +++ python3.4-3.4.1/debian/patches/profiled-build.diff @@ -0,0 +1,24 @@ +# DP: Ignore errors in the profile task. 
+ +Index: b/Makefile.pre.in +=================================================================== +--- a/Makefile.pre.in ++++ b/Makefile.pre.in +@@ -478,7 +478,16 @@ build_all_generate_profile: + + run_profile_task: + : # FIXME: can't run for a cross build +- $(RUNSHARED) ./$(BUILDPYTHON) $(PROFILE_TASK) ++ task="$(PROFILE_TASK)"; \ ++ case "$$task" in \ ++ *-s\ *) \ ++ $(RUNSHARED) ./$(BUILDPYTHON) $$task; \ ++ while [ -f $(srcdir)/build/pynexttest ]; do \ ++ $(RUNSHARED) ./$(BUILDPYTHON) $$task; \ ++ done;; \ ++ *) \ ++ $(RUNSHARED) ./$(BUILDPYTHON) $$task; \ ++ esac + + build_all_use_profile: + $(MAKE) all CFLAGS="$(CFLAGS) -fprofile-use -fprofile-correction" --- python3.4-3.4.1.orig/debian/patches/revert-r83234.diff +++ python3.4-3.4.1/debian/patches/revert-r83234.diff @@ -0,0 +1,227 @@ +--- a/Doc/conf.py ++++ b/Doc/conf.py +@@ -13,7 +13,7 @@ + # --------------------- + + extensions = ['sphinx.ext.refcounting', 'sphinx.ext.coverage', +- 'sphinx.ext.doctest', 'pyspecific'] ++ 'sphinx.ext.doctest'] + templates_path = ['tools/sphinxext'] + + # General substitutions. +--- a/Doc/tools/sphinxext/pyspecific.py ++++ b/Doc/tools/sphinxext/pyspecific.py +@@ -84,32 +84,6 @@ + return [pnode] + + +-# Support for documenting decorators +- +-from sphinx import addnodes +-from sphinx.domains.python import PyModulelevel, PyClassmember +- +-class PyDecoratorMixin(object): +- def handle_signature(self, sig, signode): +- ret = super(PyDecoratorMixin, self).handle_signature(sig, signode) +- signode.insert(0, addnodes.desc_addname('@', '@')) +- return ret +- +- def needs_arglist(self): +- return False +- +-class PyDecoratorFunction(PyDecoratorMixin, PyModulelevel): +- def run(self): +- # a decorator function is a function after all +- self.name = 'py:function' +- return PyModulelevel.run(self) +- +-class PyDecoratorMethod(PyDecoratorMixin, PyClassmember): +- def run(self): +- self.name = 'py:method' +- return PyClassmember.run(self) +- +- + # Support for documenting version of removal in deprecations + + from sphinx.locale import versionlabels +@@ -227,6 +201,7 @@ + # Support for documenting Opcodes + + import re ++from sphinx import addnodes + + opcode_sig_re = re.compile(r'(\w+(?:\+\d)?)(?:\s*\((.*)\))?') + +@@ -280,5 +255,3 @@ + app.add_description_unit('pdbcommand', 'pdbcmd', '%s (pdb command)', + parse_pdb_command) + app.add_description_unit('2to3fixer', '2to3fixer', '%s (2to3 fixer)') +- app.add_directive_to_domain('py', 'decorator', PyDecoratorFunction) +- app.add_directive_to_domain('py', 'decoratormethod', PyDecoratorMethod) +--- a/Doc/library/contextlib.rst ++++ b/Doc/library/contextlib.rst +@@ -15,7 +15,7 @@ + Functions provided: + + +-.. decorator:: contextmanager ++.. function:: contextmanager(func) + + This function is a :term:`decorator` that can be used to define a factory + function for :keyword:`with` statement context managers, without needing to +--- a/Doc/library/abc.rst ++++ b/Doc/library/abc.rst +@@ -126,7 +126,7 @@ + + It also provides the following decorators: + +-.. decorator:: abstractmethod(function) ++.. function:: abstractmethod(function) + + A decorator indicating abstract methods. + +--- a/Doc/library/unittest.rst ++++ b/Doc/library/unittest.rst +@@ -666,20 +666,20 @@ + + The following decorators implement test skipping and expected failures: + +-.. decorator:: skip(reason) ++.. function:: skip(reason) + + Unconditionally skip the decorated test. *reason* should describe why the + test is being skipped. + +-.. decorator:: skipIf(condition, reason) ++.. 
function:: skipIf(condition, reason) + + Skip the decorated test if *condition* is true. + +-.. decorator:: skipUnless(condition, reason) ++.. function:: skipUnless(condition, reason) + + Skip the decorated test unless *condition* is true. + +-.. decorator:: expectedFailure ++.. function:: expectedFailure + + Mark the test as an expected failure. If the test fails when run, the test + is not counted as a failure. +@@ -973,11 +973,11 @@ + :attr:`exception` attribute. This can be useful if the intention + is to perform additional checks on the exception raised:: + +- with self.assertRaises(SomeException) as cm: +- do_something() ++ with self.assertRaises(SomeException) as cm: ++ do_something() + +- the_exception = cm.exception +- self.assertEqual(the_exception.error_code, 3) ++ the_exception = cm.exception ++ self.assertEqual(the_exception.error_code, 3) + + .. versionchanged:: 3.1 + Added the ability to use :meth:`assertRaises` as a context manager. +--- a/Doc/library/importlib.rst ++++ b/Doc/library/importlib.rst +@@ -469,7 +469,7 @@ + This module contains the various objects that help in the construction of + an :term:`importer`. + +-.. decorator:: module_for_loader ++.. function:: module_for_loader(method) + + A :term:`decorator` for a :term:`loader` method, + to handle selecting the proper +@@ -494,7 +494,7 @@ + Use of this decorator handles all the details of which module object a + loader should initialize as specified by :pep:`302`. + +-.. decorator:: set_loader ++.. function:: set_loader(fxn) + + A :term:`decorator` for a :term:`loader` method, + to set the :attr:`__loader__` +@@ -502,7 +502,7 @@ + does nothing. It is assumed that the first positional argument to the + wrapped method is what :attr:`__loader__` should be set to. + +-.. decorator:: set_package ++.. function:: set_package(fxn) + + A :term:`decorator` for a :term:`loader` to set the :attr:`__package__` + attribute on the module returned by the loader. If :attr:`__package__` is +--- a/Doc/library/functools.rst ++++ b/Doc/library/functools.rst +@@ -111,7 +111,7 @@ + + .. versionadded:: 3.2 + +-.. decorator:: total_ordering ++.. function:: total_ordering(cls) + + Given a class defining one or more rich comparison ordering methods, this + class decorator supplies the rest. This simplifies the effort involved +@@ -217,7 +217,7 @@ + Missing attributes no longer trigger an :exc:`AttributeError`. + + +-.. decorator:: wraps(wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES) ++.. function:: wraps(wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES) + + This is a convenience function for invoking ``partial(update_wrapper, + wrapped=wrapped, assigned=assigned, updated=updated)`` as a function decorator +--- a/Doc/documenting/markup.rst ++++ b/Doc/documenting/markup.rst +@@ -177,37 +177,6 @@ + are modified), side effects, and possible exceptions. A small example may be + provided. + +-.. describe:: decorator +- +- Describes a decorator function. The signature should *not* represent the +- signature of the actual function, but the usage as a decorator. For example, +- given the functions +- +- .. code-block:: python +- +- def removename(func): +- func.__name__ = '' +- return func +- +- def setnewname(name): +- def decorator(func): +- func.__name__ = name +- return func +- return decorator +- +- the descriptions should look like this:: +- +- .. decorator:: removename +- +- Remove name of the decorated function. +- +- .. decorator:: setnewname(name) +- +- Set name of the decorated function to *name*. 
+- +- There is no ``deco`` role to link to a decorator that is marked up with +- this directive; rather, use the ``:func:`` role. +- + .. describe:: class + + Describes a class. The signature can include parentheses with parameters +@@ -225,12 +194,6 @@ + parameter. The description should include similar information to that + described for ``function``. + +-.. describe:: decoratormethod +- +- Same as ``decorator``, but for decorators that are methods. +- +- Refer to a decorator method using the ``:meth:`` role. +- + .. describe:: opcode + + Describes a Python :term:`bytecode` instruction. --- python3.4-3.4.1.orig/debian/patches/revert-r83274.diff +++ python3.4-3.4.1/debian/patches/revert-r83274.diff @@ -0,0 +1,12 @@ +--- a/Doc/conf.py ++++ b/Doc/conf.py +@@ -65,9 +65,6 @@ + # Options for HTML output + # ----------------------- + +-html_theme = 'default' +-html_theme_options = {'collapsiblesidebar': True} +- + # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, + # using the given strftime format. + html_last_updated_fmt = '%b %d, %Y' --- python3.4-3.4.1.orig/debian/patches/series.in +++ python3.4-3.4.1/debian/patches/series.in @@ -0,0 +1,46 @@ +hg-updates.diff +deb-setup.diff +deb-locations.diff +site-locations.diff +distutils-install-layout.diff +locale-module.diff +distutils-link.diff +distutils-sysconfig.diff +tkinter-import.diff +gdbm-import.diff +link-opt.diff +setup-modules.diff +platform-lsbrelease.diff +bdist-wininst-notfound.diff +no-zip-on-sys.path.diff +profiled-build.diff +makesetup-bashism.diff +hurd-disable-nonworking-constants.diff +enable-fpectl.diff +#if defined (Ubuntu) +langpack-gettext.diff +#endif +#if defined (arch_os_hurd) +no-large-file-support.diff +#endif +#ifdef OLD_SPHINX +doc-build.diff +revert-r83234.diff +revert-r83274.diff +#endif +disable-sem-check.diff +lib-argparse.diff +ctypes-arm.diff +link-timemodule.diff +lto-link-flags.diff +libffi-shared.diff +multiarch.diff +distutils-init.diff +lib2to3-no-pickled-grammar.diff +ext-no-libpython-link.diff +test-no-random-order.diff +multiarch-extname.diff +tempfile-minimal.diff +disable-some-tests.diff +issue21264.diff +ensurepip-wheels.diff --- python3.4-3.4.1.orig/debian/patches/setup-modules.diff +++ python3.4-3.4.1/debian/patches/setup-modules.diff @@ -0,0 +1,52 @@ +# DP: Modules/Setup.dist: patches to build some extensions statically + +Index: b/Modules/Setup.dist +=================================================================== +--- a/Modules/Setup.dist ++++ b/Modules/Setup.dist +@@ -175,7 +175,7 @@ + #_weakref _weakref.c # basic weak reference support + #_testcapi _testcapimodule.c # Python C API test module + #_random _randommodule.c # Random number generator +-#_elementtree -I$(srcdir)/Modules/expat -DHAVE_EXPAT_CONFIG_H -DUSE_PYEXPAT_CAPI _elementtree.c # elementtree accelerator ++#_elementtree _elementtree.c -lexpat # elementtree accelerator + #_pickle _pickle.c # pickle accelerator + #_datetime _datetimemodule.c # datetime accelerator + #_bisect _bisectmodule.c # Bisection algorithms +@@ -204,10 +204,7 @@ + + # Socket module helper for SSL support; you must comment out the other + # socket line above, and possibly edit the SSL variable: +-#SSL=/usr/local/ssl +-#_ssl _ssl.c \ +-# -DUSE_SSL -I$(SSL)/include -I$(SSL)/include/openssl \ +-# -L$(SSL)/lib -lssl -lcrypto ++#_ssl _ssl.c -lssl -lcrypto + + # The crypt module is now disabled by default because it breaks builds + # on many systems (where -lcrypt is needed), e.g. Linux (I believe). 
+@@ -249,6 +246,7 @@ + #_sha256 sha256module.c + #_sha512 sha512module.c + ++#_hashlib _hashopenssl.c -lssl -lcrypto + + # The _tkinter module. + # +@@ -337,6 +335,7 @@ + # Fred Drake's interface to the Python parser + #parser parsermodule.c + ++#_ctypes _ctypes/_ctypes.c _ctypes/callbacks.c _ctypes/callproc.c _ctypes/stgdict.c _ctypes/cfield.c _ctypes/malloc_closure.c -lffi + + # Lee Busby's SIGFPE modules. + # The library to link fpectl with is platform specific. +@@ -371,7 +370,7 @@ + # + # More information on Expat can be found at www.libexpat.org. + # +-#pyexpat expat/xmlparse.c expat/xmlrole.c expat/xmltok.c pyexpat.c -I$(srcdir)/Modules/expat -DHAVE_EXPAT_CONFIG_H -DUSE_PYEXPAT_CAPI ++#pyexpat pyexpat.c -lexpat + + # Hye-Shik Chang's CJKCodecs + --- python3.4-3.4.1.orig/debian/patches/site-locations.diff +++ python3.4-3.4.1/debian/patches/site-locations.diff @@ -0,0 +1,51 @@ +# DP: Set site-packages/dist-packages + +Index: b/Lib/site.py +=================================================================== +--- a/Lib/site.py ++++ b/Lib/site.py +@@ -7,12 +7,18 @@ + This will append site-specific paths to the module search path. On + Unix (including Mac OSX), it starts with sys.prefix and + sys.exec_prefix (if different) and appends +-lib/python/site-packages as well as lib/site-python. ++lib/python3/dist-packages as well as lib/site-python. + On other platforms (such as Windows), it tries each of the + prefixes directly, as well as with lib/site-packages appended. The + resulting directories, if they exist, are appended to sys.path, and + also inspected for path configuration files. + ++For Debian and derivatives, this sys.path is augmented with directories ++for packages distributed within the distribution. Local addons go ++into /usr/local/lib/python/dist-packages, Debian addons ++install into /usr/lib/python3/dist-packages. ++/usr/lib/python/site-packages is not used. 
++ + If a file named "pyvenv.cfg" exists one directory above sys.executable, + sys.prefix and sys.exec_prefix are set to that directory and + it is also checked for site-packages and site-python (sys.base_prefix and +@@ -304,10 +310,21 @@ + seen.add(prefix) + + if os.sep == '/': ++ if 'VIRTUAL_ENV' in os.environ or sys.base_prefix != sys.prefix: ++ sitepackages.append(os.path.join(prefix, "lib", ++ "python" + sys.version[:3], ++ "site-packages")) ++ sitepackages.append(os.path.join(prefix, "local/lib", ++ "python" + sys.version[:3], ++ "dist-packages")) ++ sitepackages.append(os.path.join(prefix, "lib", ++ "python3", ++ "dist-packages")) ++ # this one is deprecated for Debian + sitepackages.append(os.path.join(prefix, "lib", +- "python" + sys.version[:3], +- "site-packages")) +- sitepackages.append(os.path.join(prefix, "lib", "site-python")) ++ "python" + sys.version[:3], ++ "dist-packages")) ++ sitepackages.append(os.path.join(prefix, "lib", "dist-python")) + else: + sitepackages.append(prefix) + sitepackages.append(os.path.join(prefix, "lib", "site-packages")) --- python3.4-3.4.1.orig/debian/patches/sysconfig-debian-schemes.diff +++ python3.4-3.4.1/debian/patches/sysconfig-debian-schemes.diff @@ -0,0 +1,67 @@ +# DP: Add schemes 'deb_system' and 'posix_local', make the latter the default + +--- a/Lib/sysconfig.py ++++ b/Lib/sysconfig.py +@@ -32,6 +32,30 @@ + 'scripts': '{base}/bin', + 'data': '{base}', + }, ++ 'deb_system': { ++ 'stdlib': '{installed_base}/lib/python{py_version_short}', ++ 'platstdlib': '{platbase}/lib/python{py_version_short}', ++ 'purelib': '{base}/lib/python3/dist-packages', ++ 'platlib': '{platbase}/lib/python3/dist-packages', ++ 'include': ++ '{installed_base}/include/python{py_version_short}{abiflags}', ++ 'platinclude': ++ '{installed_platbase}/include/python{py_version_short}{abiflags}', ++ 'scripts': '{base}/bin', ++ 'data': '{base}', ++ }, ++ 'posix_local': { ++ 'stdlib': '{installed_base}/lib/python{py_version_short}', ++ 'platstdlib': '{platbase}/lib/python{py_version_short}', ++ 'purelib': '{base}/local/lib/python{py_version_short}/dist-packages', ++ 'platlib': '{platbase}/local/lib/python{py_version_short}/dist-packages', ++ 'include': ++ '{installed_base}/local/include/python{py_version_short}{abiflags}', ++ 'platinclude': ++ '{installed_platbase}/local/include/python{py_version_short}{abiflags}', ++ 'scripts': '{base}/local/bin', ++ 'data': '{base}', ++ }, + 'posix_home': { + 'stdlib': '{installed_base}/lib/python', + 'platstdlib': '{base}/lib/python', +@@ -162,7 +186,7 @@ + _PYTHON_BUILD = is_python_build(True) + + if _PYTHON_BUILD: +- for scheme in ('posix_prefix', 'posix_home'): ++ for scheme in ('posix_prefix', 'posix_home', 'posix_local', 'deb_system'): + _INSTALL_SCHEMES[scheme]['include'] = '{srcdir}/Include' + _INSTALL_SCHEMES[scheme]['platinclude'] = '{projectbase}/.' 
+ +@@ -200,7 +224,12 @@ + def _get_default_scheme(): + if os.name == 'posix': + # the default scheme for posix is posix_prefix +- return 'posix_prefix' ++ if 'real_prefix' in sys.__dict__ or 'VIRTUAL_ENV' in os.environ: ++ # virtual environments ++ return 'posix_prefix' ++ else: ++ # Debian default ++ return 'posix_local' + return os.name + + +@@ -485,7 +514,7 @@ + else: + inc_dir = _sys_home or _PROJECT_BASE + else: +- inc_dir = get_path('platinclude') ++ inc_dir = get_path('platinclude', 'posix_prefix') + return os.path.join(inc_dir, 'pyconfig.h') + + --- python3.4-3.4.1.orig/debian/patches/sysconfigdata.diff +++ python3.4-3.4.1/debian/patches/sysconfigdata.diff @@ -0,0 +1,76 @@ +# DP: Issue #15298: Generate _sysconfigdata.py in the build dir, not the source dir. + +diff -r 2ecdda96f970 Lib/sysconfig.py +--- a/Lib/sysconfig.py Tue Jul 10 18:27:54 2012 +0200 ++++ b/Lib/sysconfig.py Tue Jul 10 22:06:43 2012 +0200 +@@ -390,7 +390,7 @@ + if _PYTHON_BUILD: + vars['LDSHARED'] = vars['BLDSHARED'] + +- destfile = os.path.join(os.path.dirname(__file__), '_sysconfigdata.py') ++ destfile = '_sysconfigdata.py' + with open(destfile, 'w', encoding='utf8') as f: + f.write('# system configuration generated and used by' + ' the sysconfig module\n') +diff -r 2ecdda96f970 Makefile.pre.in +--- a/Makefile.pre.in Tue Jul 10 18:27:54 2012 +0200 ++++ b/Makefile.pre.in Tue Jul 10 22:06:43 2012 +0200 +@@ -410,7 +410,7 @@ + Objects/unicodectype.o \ + Objects/weakrefobject.o + +-SYSCONFIGDATA=$(srcdir)/Lib/_sysconfigdata.py ++SYSCONFIGDATA=_sysconfigdata.py + + ########################################################################## + # objects that get linked into the Python library +@@ -472,6 +472,9 @@ + # Generate the sysconfig build-time data + $(SYSCONFIGDATA): $(BUILDPYTHON) + $(RUNSHARED) $(PYTHON_FOR_BUILD) -S -m sysconfig --generate-posix-vars ++ $(RUNSHARED) $(PYTHON_FOR_BUILD) -S -c 'import os,sys ; from distutils.util import get_platform ; d=os.path.join("build", "lib."+get_platform()+"-"+sys.version[0:3]+("-pydebug" if hasattr(sys, "gettotalrefcount") else "")); print(d, end="")' > pybuilddir.txt ++ mkdir -p `cat pybuilddir.txt` ++ cp $(SYSCONFIGDATA) `cat pybuilddir.txt`/. + + # Build the shared modules + sharedmods: $(BUILDPYTHON) $(SYSCONFIGDATA) +@@ -1036,7 +1039,7 @@ + else true; \ + fi; \ + done +- @for i in $(srcdir)/Lib/*.py ; \ ++ @for i in $(srcdir)/Lib/*.py $(SYSCONFIGDATA); \ + do \ + if test -x $$i; then \ + $(INSTALL_SCRIPT) $$i $(DESTDIR)$(LIBDEST); \ +diff -r 2ecdda96f970 setup.py +--- a/setup.py Tue Jul 10 18:27:54 2012 +0200 ++++ b/setup.py Tue Jul 10 22:06:43 2012 +0200 +@@ -33,10 +33,6 @@ + # This global variable is used to hold the list of modules to be disabled. + disabled_module_list = [] + +-# File which contains the directory for shared mods (for sys.path fixup +-# when running from the build dir, see Modules/getpath.c) +-_BUILDDIR_COOKIE = "pybuilddir.txt" +- + def add_dir_to_list(dirlist, dir): + """Add the directory 'dir' to the list 'dirlist' (after any relative + directories) if: +@@ -250,12 +246,9 @@ + args['compiler_so'] = compiler + ' ' + ccshared + ' ' + cflags + self.compiler.set_executables(**args) + +- # Not only do we write the builddir cookie, but we manually install +- # the shared modules directory if it isn't already in sys.path. +- # Otherwise trying to import the extensions after building them +- # will fail. 
+- with open(_BUILDDIR_COOKIE, "wb") as f: +- f.write(self.build_lib.encode('utf-8', 'surrogateescape')) ++ # We manually install the shared modules directory if it isn't ++ # already in sys.path. Otherwise trying to import the ++ # extensions after building them will fail. + abs_build_lib = os.path.join(os.getcwd(), self.build_lib) + if abs_build_lib not in sys.path: + sys.path.append(abs_build_lib) + --- python3.4-3.4.1.orig/debian/patches/tempfile-minimal.diff +++ python3.4-3.4.1/debian/patches/tempfile-minimal.diff @@ -0,0 +1,170 @@ +# DP: Avoid shutil import when it is not available. + +Index: b/Lib/tempfile.py +=================================================================== +--- a/Lib/tempfile.py ++++ b/Lib/tempfile.py +@@ -31,7 +31,146 @@ import functools as _functools + import warnings as _warnings + import io as _io + import os as _os +-import shutil as _shutil ++try: ++ import shutil as _shutil ++ _rmtree = _shutil.rmtree ++except ImportError: ++ import sys as _sys ++ import stat as _stat ++ # version vulnerable to race conditions ++ def _rmtree_unsafe(path, onerror): ++ try: ++ if _os.path.islink(path): ++ # symlinks to directories are forbidden, see bug #1669 ++ raise OSError("Cannot call rmtree on a symbolic link") ++ except OSError: ++ onerror(_os.path.islink, path, _sys.exc_info()) ++ # can't continue even if onerror hook returns ++ return ++ names = [] ++ try: ++ names = _os.listdir(path) ++ except OSError: ++ onerror(_os.listdir, path, _sys.exc_info()) ++ for name in names: ++ fullname = _os.path.join(path, name) ++ try: ++ mode = _os.lstat(fullname).st_mode ++ except OSError: ++ mode = 0 ++ if _stat.S_ISDIR(mode): ++ _rmtree_unsafe(fullname, onerror) ++ else: ++ try: ++ _os.unlink(fullname) ++ except OSError: ++ onerror(_os.unlink, fullname, _sys.exc_info()) ++ try: ++ _os.rmdir(path) ++ except OSError: ++ onerror(_os.rmdir, path, _sys.exc_info()) ++ ++ # Version using fd-based APIs to protect against races ++ def _rmtree_safe_fd(topfd, path, onerror): ++ names = [] ++ try: ++ names = _os.listdir(topfd) ++ except OSError as err: ++ err.filename = path ++ onerror(_os.listdir, path, _sys.exc_info()) ++ for name in names: ++ fullname = _os.path.join(path, name) ++ try: ++ orig_st = _os.stat(name, dir_fd=topfd, follow_symlinks=False) ++ mode = orig_st.st_mode ++ except OSError: ++ mode = 0 ++ if _stat.S_ISDIR(mode): ++ try: ++ dirfd = _os.open(name, _os.O_RDONLY, dir_fd=topfd) ++ except OSError: ++ onerror(_os.open, fullname, _sys.exc_info()) ++ else: ++ try: ++ if _os.path.samestat(orig_st, _os.fstat(dirfd)): ++ _rmtree_safe_fd(dirfd, fullname, onerror) ++ try: ++ _os.rmdir(name, dir_fd=topfd) ++ except OSError: ++ onerror(_os.rmdir, fullname, _sys.exc_info()) ++ else: ++ try: ++ # This can only happen if someone replaces ++ # a directory with a symlink after the call to ++ # stat.S_ISDIR above. ++ raise OSError("Cannot call rmtree on a symbolic " ++ "link") ++ except OSError: ++ onerror(_os.path.islink, fullname, _sys.exc_info()) ++ finally: ++ _os.close(dirfd) ++ else: ++ try: ++ _os.unlink(name, dir_fd=topfd) ++ except OSError: ++ onerror(_os.unlink, fullname, _sys.exc_info()) ++ ++ _use_fd_functions = ({_os.open, _os.stat, _os.unlink, _os.rmdir} <= ++ _os.supports_dir_fd and ++ _os.listdir in _os.supports_fd and ++ _os.stat in _os.supports_follow_symlinks) ++ ++ def _rmtree(path, ignore_errors=False, onerror=None): ++ """Recursively delete a directory tree. 
++ ++ If ignore_errors is set, errors are ignored; otherwise, if onerror ++ is set, it is called to handle the error with arguments (func, ++ path, exc_info) where func is platform and implementation dependent; ++ path is the argument to that function that caused it to fail; and ++ exc_info is a tuple returned by sys.exc_info(). If ignore_errors ++ is false and onerror is None, an exception is raised. ++ ++ """ ++ if ignore_errors: ++ def onerror(*args): ++ pass ++ elif onerror is None: ++ def onerror(*args): ++ raise ++ if _use_fd_functions: ++ # While the unsafe rmtree works fine on bytes, the fd based does not. ++ if isinstance(path, bytes): ++ path = _os.fsdecode(path) ++ # Note: To guard against symlink races, we use the standard ++ # lstat()/open()/fstat() trick. ++ try: ++ orig_st = _os.lstat(path) ++ except Exception: ++ onerror(_os.lstat, path, _sys.exc_info()) ++ return ++ try: ++ fd = _os.open(path, _os.O_RDONLY) ++ except Exception: ++ onerror(_os.lstat, path, _sys.exc_info()) ++ return ++ try: ++ if _os.path.samestat(orig_st, _os.fstat(fd)): ++ _rmtree_safe_fd(fd, path, onerror) ++ try: ++ _os.rmdir(path) ++ except OSError: ++ onerror(_os.rmdir, path, _sys.exc_info()) ++ else: ++ try: ++ # symlinks to directories are forbidden, see bug #1669 ++ raise OSError("Cannot call rmtree on a symbolic link") ++ except OSError: ++ onerror(_os.path.islink, path, _sys.exc_info()) ++ finally: ++ _os.close(fd) ++ else: ++ return _rmtree_unsafe(path, onerror) ++ + import errno as _errno + from random import Random as _Random + import weakref as _weakref +@@ -676,7 +815,7 @@ class TemporaryDirectory(object): + + @classmethod + def _cleanup(cls, name, warn_message=None): +- _shutil.rmtree(name) ++ _rmtree(name) + if warn_message is not None: + _warnings.warn(warn_message, ResourceWarning) + +@@ -694,5 +833,5 @@ class TemporaryDirectory(object): + if self._finalizer is not None: + self._finalizer.detach() + if self.name is not None and not self._closed: +- _shutil.rmtree(self.name) ++ _rmtree(self.name) + self._closed = True --- python3.4-3.4.1.orig/debian/patches/test-no-random-order.diff +++ python3.4-3.4.1/debian/patches/test-no-random-order.diff @@ -0,0 +1,14 @@ +# DP: Don't run the test suite in random order. 
+ +Index: b/Tools/scripts/run_tests.py +=================================================================== +--- a/Tools/scripts/run_tests.py ++++ b/Tools/scripts/run_tests.py +@@ -39,7 +39,6 @@ def main(regrtest_args): + args.extend(['-W', 'error::BytesWarning']) + + args.extend(['-m', 'test', # Run the test suite +- '-r', # Randomize test order + '-w', # Re-run failed tests in verbose mode + ]) + if sys.platform == 'win32': --- python3.4-3.4.1.orig/debian/patches/tkinter-import.diff +++ python3.4-3.4.1/debian/patches/tkinter-import.diff @@ -0,0 +1,18 @@ +# DP: suggest installation of python-tk package on failing _tkinter import + +Index: b/Lib/tkinter/__init__.py +=================================================================== +--- a/Lib/tkinter/__init__.py ++++ b/Lib/tkinter/__init__.py +@@ -35,7 +35,10 @@ + # Attempt to configure Tcl/Tk without requiring PATH + from tkinter import _fix + +-import _tkinter # If this fails your Python may not be configured for Tk ++try: ++ import _tkinter ++except ImportError as msg: ++ raise ImportError(str(msg) + ', please install the python3-tk package') + TclError = _tkinter.TclError + from tkinter.constants import * + import re --- python3.4-3.4.1.orig/debian/pdb.1.in +++ python3.4-3.4.1/debian/pdb.1.in @@ -0,0 +1,16 @@ +.TH PDB@VER@ 1 +.SH NAME +pdb@VER@ \- the Python debugger +.SH SYNOPSIS +.PP +.B pdb@VER@ +.I script [...] +.SH DESCRIPTION +.PP +See /usr/lib/python@VER@/pdb.doc for more information on the use +of pdb. When the debugger is started, help is available via the +help command. +.SH SEE ALSO +python@VER@(1). Chapter 9 of the Python Library Reference +(The Python Debugger). Available in the python@VER@-doc package at +/usr/share/doc/python@VER@/html/lib/module-pdb.html. --- python3.4-3.4.1.orig/debian/pydoc.1.in +++ python3.4-3.4.1/debian/pydoc.1.in @@ -0,0 +1,53 @@ +.TH PYDOC@VER@ 1 +.SH NAME +pydoc@VER@ \- the Python documentation tool +.SH SYNOPSIS +.PP +.B pydoc@VER@ +.I name +.PP +.B pydoc@VER@ -k +.I keyword +.PP +.B pydoc@VER@ -p +.I port +.PP +.B pydoc@VER@ -g +.PP +.B pydoc@VER@ -w +.I module [...] +.SH DESCRIPTION +.PP +.B pydoc@VER@ +.I name +Show text documentation on something. +.I name +may be the name of a +Python keyword, topic, function, module, or package, or a dotted +reference to a class or function within a module or module in a +package. If +.I name +contains a '/', it is used as the path to a +Python source file to document. If name is 'keywords', 'topics', +or 'modules', a listing of these things is displayed. +.PP +.B pydoc@VER@ -k +.I keyword +Search for a keyword in the synopsis lines of all available modules. +.PP +.B pydoc@VER@ -p +.I port +Start an HTTP server on the given port on the local machine. +.PP +.B pydoc@VER@ -g +Pop up a graphical interface for finding and serving documentation. +.PP +.B pydoc@VER@ -w +.I name [...] +Write out the HTML documentation for a module to a file in the current +directory. If +.I name +contains a '/', it is treated as a filename; if +it names a directory, documentation is written for all the contents. +.SH AUTHOR +Moshe Zadka, based on "pydoc --help" --- python3.4-3.4.1.orig/debian/pygettext.1 +++ python3.4-3.4.1/debian/pygettext.1 @@ -0,0 +1,108 @@ +.TH PYGETTEXT 1 "" "pygettext 1.4" +.SH NAME +pygettext \- Python equivalent of xgettext(1) +.SH SYNOPSIS +.B pygettext +[\fIOPTIONS\fR] \fIINPUTFILE \fR... +.SH DESCRIPTION +pygettext is deprecated. The current version of xgettext supports +many languages, including Python. 
+
+pygettext uses Python's standard tokenize module to scan Python
+source code, generating .pot files identical to what GNU xgettext generates
+for C and C++ code. From there, the standard GNU tools can be used.
+.PP
+pygettext searches only for _() by default, even though GNU xgettext
+recognizes the following keywords: gettext, dgettext, dcgettext,
+and gettext_noop. See the \fB\-k\fR/\fB\-\-keyword\fR flag below for how to
+augment this.
+.PP
+.SH OPTIONS
+.TP
+\fB\-a\fR, \fB\-\-extract\-all\fR
+Extract all strings.
+.TP
+\fB\-d\fR, \fB\-\-default\-domain\fR=\fINAME\fR
+Rename the default output file from messages.pot to name.pot.
+.TP
+\fB\-E\fR, \fB\-\-escape\fR
+Replace non-ASCII characters with octal escape sequences.
+.TP
+\fB\-D\fR, \fB\-\-docstrings\fR
+Extract module, class, method, and function docstrings.
+These do not need to be wrapped in _() markers, and in fact cannot
+be for Python to consider them docstrings. (See also the \fB\-X\fR option).
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+Print this help message and exit.
+.TP
+\fB\-k\fR, \fB\-\-keyword\fR=\fIWORD\fR
+Keywords to look for in addition to the default set, which are: _
+.IP
+You can have multiple \fB\-k\fR flags on the command line.
+.TP
+\fB\-K\fR, \fB\-\-no\-default\-keywords\fR
+Disable the default set of keywords (see above).
+Any keywords explicitly added with the \fB\-k\fR/\fB\-\-keyword\fR option
+are still recognized.
+.TP
+\fB\-\-no\-location\fR
+Do not write filename/lineno location comments.
+.TP
+\fB\-n\fR, \fB\-\-add\-location\fR
+Write filename/lineno location comments indicating where each
+extracted string is found in the source. These lines appear before
+each msgid. The style of comments is controlled by the
+\fB\-S\fR/\fB\-\-style\fR option. This is the default.
+.TP
+\fB\-o\fR, \fB\-\-output\fR=\fIFILENAME\fR
+Rename the default output file from messages.pot to FILENAME.
+If FILENAME is `-' then the output is sent to standard out.
+.TP
+\fB\-p\fR, \fB\-\-output\-dir\fR=\fIDIR\fR
+Output files will be placed in directory DIR.
+.TP
+\fB\-S\fR, \fB\-\-style\fR=\fISTYLENAME\fR
+Specify which style to use for location comments.
+Two styles are supported:
+.RS
+.IP \(bu 4
+Solaris # File: filename, line: line-number
+.IP \(bu 4
+GNU #: filename:line
+.RE
+.IP
+The style name is case insensitive.
+GNU style is the default.
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+Print the names of the files being processed.
+.TP
+\fB\-V\fR, \fB\-\-version\fR
+Print the version of pygettext and exit.
+.TP
+\fB\-w\fR, \fB\-\-width\fR=\fICOLUMNS\fR
+Set width of output to columns.
+.TP
+\fB\-x\fR, \fB\-\-exclude\-file\fR=\fIFILENAME\fR
+Specify a file that contains a list of strings that are not to be
+extracted from the input files. Each string to be excluded must
+appear on a line by itself in the file.
+.TP
+\fB\-X\fR, \fB\-\-no\-docstrings\fR=\fIFILENAME\fR
+Specify a file that contains a list of files (one per line) that
+should not have their docstrings extracted. This is only useful in
+conjunction with the \fB\-D\fR option above.
+.PP
+If `INPUTFILE' is -, standard input is read.
+.SH BUGS
+pygettext attempts to be option and feature compatible with GNU xgettext
+wherever possible. However, some options are still missing or are not fully
+implemented. Also, xgettext's use of command line switches with option
+arguments is broken, and in these cases, pygettext just defines additional
+switches.
+.SH AUTHOR
+pygettext is written by Barry Warsaw.
+.PP
+Joonas Paalasmaa put this manual page together
+based on "pygettext --help".
--- python3.4-3.4.1.orig/debian/pyhtml2devhelp.py +++ python3.4-3.4.1/debian/pyhtml2devhelp.py @@ -0,0 +1,273 @@ +#! /usr/bin/python3 + +from html.parser import HTMLParser +import formatter +import os, sys, re + +class PyHTMLParser(HTMLParser): + pages_to_include = set(('whatsnew/index.html', 'tutorial/index.html', 'using/index.html', + 'reference/index.html', 'library/index.html', 'howto/index.html', + 'extending/index.html', 'c-api/index.html', 'install/index.html', + 'distutils/index.html')) + + def __init__(self, formatter, basedir, fn, indent, parents=set()): + HTMLParser.__init__(self, formatter) + self.basedir = basedir + self.dir, self.fn = os.path.split(fn) + self.data = '' + self.parents = parents + self.link = {} + self.indent = indent + self.last_indent = indent - 1 + self.sub_indent = 0 + self.sub_count = 0 + self.next_link = False + + def process_link(self): + new_href = os.path.join(self.dir, self.link['href']) + text = self.link['text'] + indent = self.indent + self.sub_indent + if self.last_indent == indent: + print('%s' % (' ' * self.last_indent)) + self.sub_count -= 1 + print('%s' % (' ' * indent, new_href, text)) + self.sub_count += 1 + self.last_indent = self.indent + self.sub_indent + + def handle_starttag(self, tag, attrs): + if tag == 'a': + self.start_a(attrs) + elif tag == 'li': + self.start_li(attrs) + + def handle_endtag(self, tag): + if tag == 'a': + self.end_a() + elif tag == 'li': + self.end_li() + + def start_li(self, attrs): + self.sub_indent += 1 + self.next_link = True + + def end_li(self): + indent = self.indent + self.sub_indent + if self.sub_count > 0: + print('%s' % (' ' * self.last_indent)) + self.sub_count -= 1 + self.last_indent -= 1 + self.sub_indent -= 1 + + def start_a(self, attrs): + self.link = {} + for attr in attrs: + self.link[attr[0]] = attr[1] + self.data = '' + + def end_a(self): + process = False + text = self.data.replace('\t', '').replace('\n', ' ').replace('&', '&').replace('<', '<').replace('>', '>') + self.link['text'] = text + # handle a tag without href attribute + try: + href = self.link['href'] + except KeyError: + return + + abs_href = os.path.join(self.basedir, href) + if abs_href in self.parents: + return + if href.startswith('..') or href.startswith('http:') \ + or href.startswith('mailto:') or href.startswith('news:'): + return + if href in ('', 'about.html', 'modindex.html', 'genindex.html', 'glossary.html', + 'search.html', 'contents.html', 'download.html', 'bugs.html', + 'license.html', 'copyright.html'): + return + + if 'class' in self.link: + if self.link['class'] in ('biglink'): + process = True + if self.link['class'] in ('reference external'): + if self.next_link: + process = True + next_link = False + + if process == True: + self.process_link() + if href in self.pages_to_include: + self.parse_file(os.path.join(self.dir, href)) + + def finish(self): + if self.sub_count > 0: + print('%s' % (' ' * self.last_indent)) + + def handle_data(self, data): + self.data += data + + def parse_file(self, href): + # TODO basedir bestimmen + parent = os.path.join(self.basedir, self.fn) + self.parents.add(parent) + parser = PyHTMLParser(formatter.NullFormatter(), + self.basedir, href, self.indent + 1, + self.parents) + text = open(self.basedir + '/' + href, encoding='latin_1').read() + parser.feed(text) + parser.finish() + parser.close() + if parent in self.parents: + self.parents.remove(parent) + +class PyIdxHTMLParser(HTMLParser): + def __init__(self, formatter, basedir, fn, indent): + HTMLParser.__init__(self, formatter) + 
self.basedir = basedir + self.dir, self.fn = os.path.split(fn) + self.data = '' + self.link = {} + self.indent = indent + self.active = False + self.indented = False + self.nolink = False + self.header = '' + self.last_letter = 'Z' + self.last_text = '' + + def process_link(self): + new_href = os.path.join(self.dir, self.link['href']) + text = self.link['text'] + if not self.active: + return + if text.startswith('['): + return + if self.link.get('rel', None) in ('prev', 'parent', 'next', 'contents', 'index'): + return + if self.indented: + text = self.last_text + ' ' + text + else: + # Save it in case we need it again + self.last_text = re.sub(' \([\w\-\.\s]+\)', '', text) + indent = self.indent + print('%s' % (' ' * indent, new_href, text)) + + def handle_starttag(self, tag, attrs): + if tag == 'a': + self.start_a(attrs) + elif tag == 'dl': + self.start_dl(attrs) + elif tag == 'dt': + self.start_dt(attrs) + elif tag == 'h2': + self.start_h2(attrs) + elif tag == 'td': + self.start_td(attrs) + elif tag == 'table': + self.start_table(attrs) + + def handle_endtag(self, tag): + if tag == 'a': + self.end_a() + elif tag == 'dl': + self.end_dl() + elif tag == 'dt': + self.end_dt() + elif tag == 'h2': + self.end_h2() + elif tag == 'td': + self.end_td() + elif tag == 'table': + self.end_table() + + def start_dl(self, attrs): + if self.last_text: + # Looks like we found the second part to a command + self.indented = True + + def end_dl(self): + self.indented = False + + def start_dt(self, attrs): + self.data = '' + self.nolink = True + + def end_dt(self): + if not self.active: + return + if self.nolink == True: + # Looks like we found the first part to a command + self.last_text = re.sub(' \([\w\-\.\s]+\)', '', self.data) + self.nolink = False + + def start_h2(self, attrs): + for k, v in attrs: + if k == 'id': + self.header = v + if v == '_': + self.active = True + + def end_h2(self): + pass + + def start_td(self, attrs): + self.indented = False + self.last_text = '' + + def end_td(self): + pass + + def start_table(self, attrs): + pass + + def end_table(self): + if self.header == self.last_letter: + self.active = False + + def start_a(self, attrs): + self.nolink = False + self.link = {} + for attr in attrs: + self.link[attr[0]] = attr[1] + self.data = '' + + def end_a(self): + text = self.data.replace('\t', '').replace('\n', ' ') + text = text.replace("Whats ", "What's ") + self.link['text'] = text + # handle a tag without href attribute + try: + href = self.link['href'] + except KeyError: + return + self.process_link() + + def handle_data(self, data): + self.data += data + + def handle_entityref(self, name): + self.data += '&%s;' % name + +def main(): + base = sys.argv[1] + fn = sys.argv[2] + version = sys.argv[3] + + parser = PyHTMLParser(formatter.NullFormatter(), base, fn, indent=0) + print('') + print('' % (version, version, version)) + print('') + parser.parse_file(fn) + print('') + + print('') + + fn = 'genindex-all.html' + parser = PyIdxHTMLParser(formatter.NullFormatter(), base, fn, indent=1) + text = open(base + '/' + fn, encoding='latin_1').read() + parser.feed(text) + parser.close() + + print('') + print('') + +main() --- python3.4-3.4.1.orig/debian/pylogo.xpm +++ python3.4-3.4.1/debian/pylogo.xpm @@ -0,0 +1,351 @@ +/* XPM */ +static char * pylogo_xpm[] = { +"32 32 316 2", +" c None", +". 
c #8DB0CE", +"+ c #6396BF", +"@ c #4985B7", +"# c #4181B5", +"$ c #417EB2", +"% c #417EB1", +"& c #4D83B0", +"* c #6290B6", +"= c #94B2CA", +"- c #70A1C8", +"; c #3D83BC", +"> c #3881BD", +", c #387DB6", +"' c #387CB5", +") c #387BB3", +"! c #3779B0", +"~ c #3778AE", +"{ c #3776AB", +"] c #3776AA", +"^ c #3775A9", +"/ c #4A7FAC", +"( c #709FC5", +"_ c #3A83BE", +": c #5795C7", +"< c #94B9DB", +"[ c #73A4CE", +"} c #3D80B7", +"| c #387CB4", +"1 c #377AB2", +"2 c #377AB0", +"3 c #3777AC", +"4 c #3774A7", +"5 c #3773A5", +"6 c #3C73A5", +"7 c #4586BB", +"8 c #4489C1", +"9 c #A7C7E1", +"0 c #F7F9FD", +"a c #E1E9F1", +"b c #4C89BC", +"c c #3779AF", +"d c #3778AD", +"e c #3873A5", +"f c #4B7CA4", +"g c #3982BE", +"h c #4389C1", +"i c #A6C6E1", +"j c #F6F9FC", +"k c #D6E4F0", +"l c #4A88BB", +"m c #3773A6", +"n c #366F9F", +"o c #366E9D", +"p c #376E9C", +"q c #4A8BC0", +"r c #79A7CD", +"s c #548EBD", +"t c #387AB0", +"u c #3773A4", +"v c #366D9C", +"w c #387FBA", +"x c #387DB7", +"y c #387BB4", +"z c #3775A8", +"A c #366FA0", +"B c #4981AF", +"C c #427BAA", +"D c #3772A4", +"E c #376B97", +"F c #77A3C8", +"G c #4586BC", +"H c #3882BE", +"I c #3B76A7", +"J c #3B76A6", +"K c #366E9E", +"L c #376B98", +"M c #376B96", +"N c #5681A3", +"O c #F5EEB8", +"P c #FFED60", +"Q c #FFE85B", +"R c #FFE659", +"S c #FDE55F", +"T c #5592C4", +"U c #3A83BF", +"V c #3882BD", +"W c #387FB9", +"X c #3779AE", +"Y c #366F9E", +"Z c #366C98", +"` c #376A94", +" . c #5D85A7", +".. c #F5EDB7", +"+. c #FFEA5D", +"@. c #FFE75A", +"#. c #FFE354", +"$. c #FDDD56", +"%. c #669DC8", +"&. c #3885C3", +"*. c #3884C2", +"=. c #387EB8", +"-. c #387CB6", +";. c #377AB1", +">. c #3772A3", +",. c #366D9B", +"'. c #F5EBB5", +"). c #FFE557", +"!. c #FFE455", +"~. c #FFDF50", +"{. c #FFDB4C", +"]. c #FAD862", +"^. c #8EB4D2", +"/. c #3C86C1", +"(. c #3883C0", +"_. c #3882BF", +":. c #3881BC", +"<. c #3880BB", +"[. c #3775AA", +"}. c #F5EAB3", +"|. c #FFE051", +"1. c #FFDE4F", +"2. c #FFDA4A", +"3. c #FED446", +"4. c #F5DF9D", +"5. c #77A5CA", +"6. c #3885C2", +"7. c #387BB2", +"8. c #6B8EA8", +"9. c #F8E7A1", +"0. c #FFE153", +"a. c #FFDD4E", +"b. c #FFDB4B", +"c. c #FFD746", +"d. c #FFD645", +"e. c #FFD342", +"f. c #F6DB8D", +"g. c #508DBE", +"h. c #3771A3", +"i. c #376A95", +"j. c #3D6F97", +"k. c #C3CBC2", +"l. c #FBD964", +"m. c #FFDC4D", +"n. c #FFD544", +"o. c #FFD040", +"p. c #F9CF58", +"q. c #3F83BB", +"r. c #376B95", +"s. c #3A6C95", +"t. c #4E7BA0", +"u. c #91AABC", +"v. c #F6E4A3", +"w. c #FFDA4B", +"x. c #FFD646", +"y. c #FFD443", +"z. c #FFD241", +"A. c #FFCE3D", +"B. c #FFCC3B", +"C. c #FCC83E", +"D. c #3880BC", +"E. c #3C79AC", +"F. c #5F8DB4", +"G. c #7AA0C0", +"H. c #82A6C3", +"I. c #82A3BF", +"J. c #82A2BE", +"K. c #82A1BB", +"L. c #82A1B9", +"M. c #8BA4B5", +"N. c #C1C5AE", +"O. c #F2E19F", +"P. c #FDD74C", +"Q. c #FFD94A", +"R. c #FFD343", +"S. c #FFCE3E", +"T. c #FFCB39", +"U. c #FFC937", +"V. c #FEC636", +"W. c #3D79AB", +"X. c #9DB6C6", +"Y. c #D0CFA2", +"Z. c #EFE598", +"`. 
c #F8EE9B", +" + c #F8EB97", +".+ c #F8E996", +"++ c #F8E894", +"@+ c #FAE489", +"#+ c #FCDB64", +"$+ c #FFDA4D", +"%+ c #FFCF3E", +"&+ c #FFCB3A", +"*+ c #FFC734", +"=+ c #FFC532", +"-+ c #3F82B7", +";+ c #387EB9", +">+ c #9EB9D0", +",+ c #F2E287", +"'+ c #FDEB69", +")+ c #FEEC60", +"!+ c #FFEB5E", +"~+ c #FFE254", +"{+ c #FFE152", +"]+ c #FFD747", +"^+ c #FFC633", +"/+ c #FCC235", +"(+ c #578FBE", +"_+ c #6996BC", +":+ c #DED9A8", +"<+ c #FEEC62", +"[+ c #FFE658", +"}+ c #FFDF51", +"|+ c #FFDE50", +"1+ c #FFD03F", +"2+ c #FFCD3C", +"3+ c #FFC431", +"4+ c #FFBF2C", +"5+ c #FAC244", +"6+ c #85AACA", +"7+ c #A1BBD2", +"8+ c #F7E47C", +"9+ c #FFE456", +"0+ c #FFC735", +"a+ c #FFBC29", +"b+ c #F7D280", +"c+ c #9DBAD2", +"d+ c #3B7CB2", +"e+ c #ABC2D6", +"f+ c #FDEB7B", +"g+ c #FFC12E", +"h+ c #FDBD30", +"i+ c #F4DEA8", +"j+ c #5F91BA", +"k+ c #ABC1D4", +"l+ c #FDEE7E", +"m+ c #FFE253", +"n+ c #FFCC3C", +"o+ c #FFBA27", +"p+ c #FAC75B", +"q+ c #4A82B0", +"r+ c #3877AB", +"s+ c #3774A6", +"t+ c #AAC0D4", +"u+ c #FDEE7D", +"v+ c #FFEC5F", +"w+ c #FFE255", +"x+ c #FFD848", +"y+ c #FFD444", +"z+ c #FFCF3F", +"A+ c #FFBC2A", +"B+ c #FFBB28", +"C+ c #FDBA32", +"D+ c #447AA8", +"E+ c #4379A7", +"F+ c #FFE95C", +"G+ c #FFE558", +"H+ c #FFE355", +"I+ c #FED84B", +"J+ c #FCD149", +"K+ c #FBCE47", +"L+ c #FBCD46", +"M+ c #FBC840", +"N+ c #FBC63E", +"O+ c #FBC037", +"P+ c #FAC448", +"Q+ c #FDD44C", +"R+ c #FCD14E", +"S+ c #FFC836", +"T+ c #FFC22F", +"U+ c #FFC02D", +"V+ c #FFE052", +"W+ c #FFC636", +"X+ c #FFCF5C", +"Y+ c #FFD573", +"Z+ c #FFC33E", +"`+ c #FEBD2D", +" @ c #FFDB4D", +".@ c #FFD949", +"+@ c #FFD545", +"@@ c #FFD140", +"#@ c #FFCB48", +"$@ c #FFF7E4", +"%@ c #FFFCF6", +"&@ c #FFE09D", +"*@ c #FFBA2E", +"=@ c #FDBE2F", +"-@ c #FFD748", +";@ c #FFCA38", +">@ c #FFC844", +",@ c #FFF2D7", +"'@ c #FFF9EC", +")@ c #FFDB94", +"!@ c #FFB92D", +"~@ c #FAC54D", +"{@ c #FDD54E", +"]@ c #FFBD2D", +"^@ c #FFC858", +"/@ c #FFD174", +"(@ c #FFBF3E", +"_@ c #FCBD3C", +":@ c #FAD66A", +"<@ c #FECD3F", +"[@ c #FFC330", +"}@ c #FFBD2A", +"|@ c #FFB724", +"1@ c #FFB521", +"2@ c #FFB526", +"3@ c #FBC457", +"4@ c #F7E09E", +"5@ c #F8D781", +"6@ c #FAC349", +"7@ c #FCC134", +"8@ c #FEBE2C", +"9@ c #FBBE3F", +"0@ c #F7CF79", +"a@ c #F5D795", +" . + @ # $ % % & * = ", +" - ; > > , ' ) ! ~ { ] ^ / ", +" ( _ : < [ } | 1 2 ~ 3 4 5 5 6 ", +" 7 8 9 0 a b 2 c d 3 { 5 5 5 e f ", +" g h i j k l c ~ { { m 5 5 n o p ", +" > > q r s t c c d 4 5 u n v v v ", +" w x ' y 2 c d d z 5 u A v v v v ", +" B C 5 D v v v v E ", +" F G H H H x ' ) c c c d I J 5 K v v L M N O P Q R S ", +" T U H V V W ' ) c c X ~ 5 5 5 Y v v Z ` ` ...+.@.#.#.$. ", +" %.&.*.> w W =.-.;.c 3 { ^ 5 5 >.o v ,.E ` ` .'.).!.#.~.{.]. ", +"^./.(._.:.<., ' ) ;.X d [.5 5 >.K v ,.E ` ` ` .}.#.|.1.{.2.3.4.", +"5.6.(.H H x ' 7.c c 3 3 4 5 D K v v ,.` ` ` ` 8.9.0.a.b.c.d.e.f.", +"g._.> <.w ' ' | 2 3 { z 5 5 h.v v v i.` ` ` j.k.l.m.{.d.n.e.o.p.", +"q.> > :.-.' 1 c c c ] 5 5 >.v v ,.r.` ` s.t.u.v.{.w.x.y.z.A.B.C.", +"D.D.w -.' 1 c c c E.F.G.H.I.J.J.K.L.L.L.M.N.O.P.Q.c.R.S.B.T.U.V.", +"D.D.=.' ' 1 c c W.X.Y.Z.`.`.`.`.`. +.+++@+#+$+Q.d.R.%+B.&+*+=+=+", +"-+;+-.' ;.2 c c >+,+'+)+P P P !+Q R ~+{+1.{.]+d.y.%+B.&+^+=+=+/+", +"(+' ' ;.c X X _+:+<+P P P P !+R [+~+}+|+{.]+n.R.1+2+&+^+=+3+4+5+", +"6+' ) ! ~ { { 7+8+P P P P !+R 9+#.{+{.w.]+y.z.S.&+0+=+=+3+4+a+b+", +"c+d+7.! 
d 3 z e+f+P P P !+R 9+#.{+m.{.]+y.1+B.&+0+=+=+g+4+a+h+i+", +" j+c d 3 { 4 k+l+P P !+@.9+m+1.m.{.]+y.1+n+B.*+=+=+g+a+a+o+p+ ", +" q+r+{ s+m t+u+v+@.R w+{+}+{.x+d.y+z+n+B.0+=+=+g+A+a+B+C+ ", +" * D+E+E+ +.F+G+H+}+}+{.I+J+K+L+M+M+M+M+N+O+O+O+O+P+ ", +" ).).#.{+a.{.x+Q+R+ ", +" #.m+1.a.{.x+y.o.2+B.S+=+=+T+U+O+ ", +" 0.V+{.{.x+n.o.2+B.B.W+X+Y+Z+a+`+ ", +" @{..@+@n.@@B.B.S+^+#@$@%@&@*@=@ ", +" ].-@x.y.o.%+;@S+=+=+>@,@'@)@!@~@ ", +" {@z.z+2+U.=+=+=+T+]@^@/@(@_@ ", +" :@<@U.=+=+[@4+}@|@1@2@3@ ", +" 4@5@6@7@8@a+a+9@0@a@ "}; --- python3.4-3.4.1.orig/debian/pymindeps.py +++ python3.4-3.4.1/debian/pymindeps.py @@ -0,0 +1,178 @@ +#! /usr/bin/python3 + +# Matthias Klose +# Modified to only exclude module imports from a given module. + +# Copyright 2004 Toby Dickenson +# +# Permission is hereby granted, free of charge, to any person obtaining +# a copy of this software and associated documentation files (the +# "Software"), to deal in the Software without restriction, including +# without limitation the rights to use, copy, modify, merge, publish, +# distribute, sublicense, and/or sell copies of the Software, and to +# permit persons to whom the Software is furnished to do so, subject +# to the following conditions: +# +# The above copyright notice and this permission notice shall be included +# in all copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + +import os, sys, pprint +import modulefinder +import imp + +class mymf(modulefinder.ModuleFinder): + def __init__(self,*args,**kwargs): + self._depgraph = {} + self._types = {} + self._last_caller = None + modulefinder.ModuleFinder.__init__(self, *args, **kwargs) + + def import_hook(self, name, caller=None, fromlist=None, level=-1): + old_last_caller = self._last_caller + try: + self._last_caller = caller + return modulefinder.ModuleFinder.import_hook(self, name, caller, + fromlist, level) + finally: + self._last_caller = old_last_caller + + def import_module(self, partnam, fqname, parent): + m = modulefinder.ModuleFinder.import_module(self, + partnam, fqname, parent) + if m is not None and self._last_caller: + caller = self._last_caller.__name__ + if '.' in caller: + caller = caller[:caller.index('.')] + callee = m.__name__ + if '.' 
in callee: + callee = callee[:callee.index('.')] + #print "XXX last_caller", caller, "MOD", callee + #self._depgraph.setdefault(self._last_caller.__name__,{})[r.__name__] = 1 + #if caller in ('pdb', 'doctest') or callee in ('pdb', 'doctest'): + # print caller, "-->", callee + if caller != callee: + self._depgraph.setdefault(caller,{})[callee] = 1 + return m + + def find_module(self, name, path, parent=None): + if parent is not None: + # assert path is not None + fullname = parent.__name__+'.'+name + elif name == "__init__": + fullname = os.path.basename(path[0]) + else: + fullname = name + if self._last_caller: + caller = self._last_caller.__name__ + if fullname in excluded_imports.get(caller, []): + #self.msgout(3, "find_module -> Excluded", fullname) + raise ImportError(name) + + if fullname in self.excludes: + #self.msgout(3, "find_module -> Excluded", fullname) + raise ImportError(name) + + if path is None: + if name in sys.builtin_module_names: + return (None, None, ("", "", imp.C_BUILTIN)) + + path = self.path + return imp.find_module(name, path) + + def load_module(self, fqname, fp, pathname, file_info): + suffix, mode, type = file_info + m = modulefinder.ModuleFinder.load_module(self, fqname, + fp, pathname, file_info) + if m is not None: + self._types[m.__name__] = type + return m + + def load_package(self, fqname, pathname): + m = modulefinder.ModuleFinder.load_package(self, fqname,pathname) + if m is not None: + self._types[m.__name__] = imp.PKG_DIRECTORY + return m + +def reduce_depgraph(dg): + pass + +# guarded imports, which don't need to be included in python-minimal +excluded_imports = { + 'argparse': set(('gettext',)), + 'codecs': set(('encodings',)), + 'collections': set(('cPickle', 'pickle', 'doctest')), + 'copy': set(('reprlib',)), + #'functools': set(('_dummy_thread',)), + 'hashlib': set(('logging', '_hashlib')), + #'hashlib': set(('_hashlib', '_md5', '_sha', '_sha256','_sha512',)), + 'heapq': set(('doctest',)), + #'io': set(('_dummy_thread',)), + 'logging': set(('multiprocessing',)), + 'os': set(('nt', 'ntpath', 'os2', 'os2emxpath', 'mac', 'macpath', + 'riscos', 'riscospath', 'riscosenviron')), + 'optparse': set(('gettext',)), + 'pickle': set(('argparse', 'doctest', 'pprint')), + 'platform': set(('plistlib', 'tempfile')), + 'reprlib': set(('_dummy_thread',)), + #'socket': set(('_ssl',)), + '_sitebuiltins': set(('pydoc',)), + 'subprocess': set(('dummy_threading',)), + 'sysconfig': set(('pprint','_osx_support')), + 'tempfile': set(('_dummy_thread', 'shutil')), + } + +def main(argv): + # Parse command line + import getopt + try: + opts, args = getopt.getopt(sys.argv[1:], "dmp:qx:") + except getopt.error as msg: + print(msg) + return + + # Process options + debug = 1 + domods = 0 + addpath = [] + exclude = [] + for o, a in opts: + if o == '-d': + debug = debug + 1 + if o == '-m': + domods = 1 + if o == '-p': + addpath = addpath + a.split(os.pathsep) + if o == '-q': + debug = 0 + if o == '-x': + exclude.append(a) + + path = sys.path[:] + path = addpath + path + + if debug > 1: + print(("version:", sys.version)) + print("path:") + for item in path: + print((" ", repr(item))) + + #exclude = ['__builtin__', 'sys', 'os'] + exclude = [] + mf = mymf(path, debug, exclude) + for arg in args: + mf.run_script(arg) + + depgraph = reduce_depgraph(mf._depgraph) + + pprint.pprint({'depgraph':mf._depgraph, 'types':mf._types}) + +if __name__=='__main__': + main(sys.argv[1:]) --- python3.4-3.4.1.orig/debian/pysetup3.1 +++ python3.4-3.4.1/debian/pysetup3.1 @@ -0,0 +1,42 @@ +.\" DO NOT 
MODIFY THIS FILE! It was generated by help2man 1.40.4. +.TH PYSETUP3.3 "1" "January 2012" "pysetup3.3 3.3" "User Commands" +.SH NAME +pysetup3.3 \- pysetup tool +.SH SYNOPSIS +.B pysetup +[\fIoptions\fR] \fIaction \fR[\fIaction_options\fR] +.SH DESCRIPTION +.SS "Actions:" +.IP +run: Run one or several commands +metadata: Display the metadata of a project +install: Install a project +remove: Remove a project +search: Search for a project in the indexes +list: List installed projects +graph: Display a graph +create: Create a project +generate\-setup: Generate a backward\-compatible setup.py +.PP +To get more help on an action, use: +.IP +pysetup action \fB\-\-help\fR +.SS "Global options:" +.TP +\fB\-\-verbose\fR (\fB\-v\fR) +run verbosely (default) +.TP +\fB\-\-quiet\fR (\fB\-q\fR) +run quietly (turns verbosity off) +.TP +\fB\-\-dry\-run\fR (\fB\-n\fR) +don't actually do anything +.TP +\fB\-\-help\fR (\fB\-h\fR) +show detailed help message +.TP +\fB\-\-no\-user\-cfg\fR +ignore pydistutils.cfg in your home directory +.TP +\fB\-\-version\fR +Display the version --- python3.4-3.4.1.orig/debian/python3-config.1 +++ python3.4-3.4.1/debian/python3-config.1 @@ -0,0 +1,102 @@ +.TH PYTHON\-CONFIG 1 "November 27, 2011" +.SH NAME +python\-config \- output build options for python C/C++ extensions or embedding +.SH SYNOPSIS +.BI "python\-config" +[ +.BI "\-\-prefix" +] +[ +.BI "\-\-exec\-prefix" +] +[ +.BI "\-\-includes" +] +[ +.BI "\-\-libs" +] +[ +.BI "\-\-cflags" +] +[ +.BI "\-\-ldflags" +] +[ +.BI "\-\-extension\-suffix" +] +[ +.BI "\-\-abiflags" +] +[ +.BI "\-\-help" +] +.SH DESCRIPTION +.B python\-config +helps compiling and linking programs, which embed the Python interpreter, or +extension modules that can be loaded dynamically (at run time) into +the interpreter. +.SH OPTIONS +.TP +.BI "\-\-abiflags" +print the the ABI flags as specified by PEP 3149. +.TP +.BI "\-\-cflags" +print the C compiler flags. +.TP +.BI "\-\-ldflags" +print the flags that should be passed to the linker. +.TP +.BI "\-\-includes" +similar to \fI\-\-cflags\fP but only with \-I options (path to python header files). +.TP +.BI "\-\-libs" +similar to \fI\-\-ldflags\fP but only with \-l options (used libraries). +.TP +.BI "\-\-prefix" +prints the prefix (base directory) under which python can be found. +.TP +.BI "\-\-exec\-prefix" +print the prefix used for executable program directories (such as bin, sbin, etc). +.TP +.BI "\-\-extension\-suffix" +print the extension suffix used for binary extensions. +.TP +.BI "\-\-help" +print the usage message. +.PP + +.SH EXAMPLES +To build the singe\-file c program \fIprog\fP against the python library, use +.PP +.RS +gcc $(python\-config \-\-cflags \-\-ldflags) progr.cpp \-o progr.cpp +.RE +.PP +The same in a makefile: +.PP +.RS +CFLAGS+=$(shell python\-config \-\-cflags) +.RE +.RS +LDFLAGS+=$(shell python\-config \-\-ldflags) +.RE +.RS +all: progr +.RE + +To build a dynamically loadable python module, use +.PP +.RS +gcc $(python\-config \-\-cflags \-\-ldflags) \-shared \-fPIC progr.cpp \-o progr.so +.RE + +.SH "SEE ALSO" +python (1) +.br +http://docs.python.org/extending/extending.html +.br +/usr/share/doc/python/faq/extending.html + +.SH AUTHORS +This manual page was written by Johann Felix Soden +for the Debian project (and may be used by others). --- python3.4-3.4.1.orig/debian/pyvenv3.1 +++ python3.4-3.4.1/debian/pyvenv3.1 @@ -0,0 +1,34 @@ +.\" DO NOT MODIFY THIS FILE! It was generated by help2man 1.40.10. 
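+.\" Illustrative only: on a Debian multiarch system the python3-config
+.\" options documented above typically expand along these lines (the
+.\" x86_64-linux-gnu triplet is an example, not fixed):
+.\"   $ python3.4-config --includes
+.\"   -I/usr/include/python3.4m -I/usr/include/x86_64-linux-gnu/python3.4m
+.\"   $ python3.4-config --extension-suffix
+.\"   .cpython-34m-x86_64-linux-gnu.so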
+.TH PYVENV-3.3 "1" "June 2012" "pyvenv-3.3 3.3" "User Commands" +.SH NAME +pyvenv-3.3 \- create virtual python environments +.SH DESCRIPTION +usage: venv [\-h] [\-\-system\-site\-packages] [\-\-symlinks] [\-\-clear] [\-\-upgrade] +.IP +ENV_DIR [ENV_DIR ...] +.PP +Creates virtual Python environments in one or more target directories. +.SS "positional arguments:" +.TP +ENV_DIR +A directory to create the environment in. +.SS "optional arguments:" +.TP +\fB\-h\fR, \fB\-\-help\fR +show this help message and exit +.TP +\fB\-\-system\-site\-packages\fR +Give the virtual environment access to the system +site\-packages dir. +.TP +\fB\-\-symlinks\fR +Attempt to symlink rather than copy. +.TP +\fB\-\-clear\fR +Delete the environment directory if it already exists. +If not specified and the directory exists, an error is +raised. +.TP +\fB\-\-upgrade\fR +Upgrade the environment directory to use this version +of Python, assuming Python has been upgraded in\-place. --- python3.4-3.4.1.orig/debian/rules +++ python3.4-3.4.1/debian/rules @@ -0,0 +1,1432 @@ +#!/usr/bin/make -f + +unexport LANG LC_ALL LC_CTYPE LC_COLLATE LC_TIME LC_NUMERIC LC_MESSAGES +unexport CFLAGS CXXFLAGS LDFLAGS CPPFLAGS + +export SHELL = /bin/bash + +# Uncomment this to turn on verbose mode. +#export DH_VERBOSE=1 + +vafilt = $(subst $(2)=,,$(filter $(2)=%,$(1))) +DPKG_VARS := $(shell dpkg-architecture) +DEB_BUILD_ARCH ?= $(call vafilt,$(DPKG_VARS),DEB_BUILD_ARCH) +DEB_BUILD_GNU_TYPE ?= $(call vafilt,$(DPKG_VARS),DEB_BUILD_GNU_TYPE) +DEB_HOST_ARCH ?= $(call vafilt,$(DPKG_VARS),DEB_HOST_ARCH) +DEB_HOST_ARCH_ENDIAN ?= $(call vafilt,$(DPKG_VARS),DEB_HOST_ARCH_ENDIAN) +DEB_HOST_ARCH_OS ?= $(call vafilt,$(DPKG_VARS),DEB_HOST_ARCH_OS) +DEB_HOST_GNU_TYPE ?= $(call vafilt,$(DPKG_VARS),DEB_HOST_GNU_TYPE) +DEB_HOST_MULTIARCH ?= $(call vafilt,$(DPKG_VARS),DEB_HOST_MULTIARCH) + +CHANGELOG_VARS := $(shell dpkg-parsechangelog | \ + sed -n 's/ /_/g;/^[^_]/s/^\([^:]*\):_\(.*\)/\1=\2/p') +PKGSOURCE := $(call vafilt,$(CHANGELOG_VARS),Source) +PKGVERSION := $(call vafilt,$(CHANGELOG_VARS),Version) + +on_buildd := $(shell [ -f /CurrentlyBuilding -o "$$LOGNAME" = buildd -o "$$USER" = buildd ] && echo yes) + +ifneq (,$(findstring nocheck, $(DEB_BUILD_OPTIONS))) + WITHOUT_CHECK := yes +endif +WITHOUT_BENCH := +ifneq (,$(findstring nobench, $(DEB_BUILD_OPTIONS))) + WITHOUT_BENCH := yes +endif +ifeq ($(on_buildd),yes) + ifneq (,$(filter $(DEB_HOST_ARCH), armel hppa mips mipsel mips64 mips64el s390 hurd-i386 kfreebsd-amd64 kfreebsd-i386)) + WITHOUT_CHECK := yes + endif + ifneq (,$(filter $(DEB_HOST_ARCH), armel hppa mips mipsel mips64 mips64el s390 hurd-i386 kfreebsd-amd64 kfreebsd-i386)) + WITHOUT_BENCH := yes + WITHOUT_STONE := yes + endif +endif +ifneq ($(DEB_HOST_GNU_TYPE),$(DEB_BUILD_GNU_TYPE)) + WITHOUT_BENCH := yes + WITHOUT_CHECK := yes + WITHOUT_STONE := yes +endif + +COMMA = , +ifneq (,$(filter parallel=%,$(subst $(COMMA), ,$(DEB_BUILD_OPTIONS)))) + NJOBS := -j $(subst parallel=,,$(filter parallel=%,$(subst $(COMMA), ,$(DEB_BUILD_OPTIONS)))) +endif + +distribution := $(shell lsb_release -is) +distrelease := $(shell lsb_release -cs) + +VER=3.4 +SVER=3.4~b1 +NVER=3.5 +PVER=python3.4 +PRIORITY=$(shell echo $(VER) | tr -d '.')0 + +PREVVER := $(shell awk '/^python/ && NR > 1 {print substr($$2,2,length($$2)-2); exit}' debian/changelog) + +# default versions are built from the python-defaults source package +# keep the definition to adjust package priorities. 
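+# Worked example (illustrative, not an extra rule): with
+# DEB_BUILD_OPTIONS="nocheck parallel=4" the filters above give
+# WITHOUT_CHECK=yes and NJOBS="-j 4", and vafilt extracts a single
+# variable from the dpkg-architecture output, e.g.
+#   $(call vafilt,DEB_HOST_ARCH=amd64 DEB_HOST_GNU_TYPE=x86_64-linux-gnu,DEB_HOST_ARCH)
+# expands to "amd64".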
+DEFAULT_VERSION = no +STATIC_PYTHON=yes + +MIN_MODS := $(shell awk '/^ / && $$2 == "module" { print $$1 }' \ + debian/PVER-minimal.README.Debian.in) +MIN_EXTS := $(shell awk '/^ / && $$2 ~ /^extension/ { print $$1 }' \ + debian/PVER-minimal.README.Debian.in) +MIN_BUILTINS := $(shell awk '/^ / && $$2 == "builtin" { print $$1 }' \ + debian/PVER-minimal.README.Debian.in) +MIN_PACKAGES := $(shell awk '/^ / && $$2 == "package" { print $$1 }' \ + debian/PVER-minimal.README.Debian.in) +MIN_ENCODINGS := $(foreach i, \ + $(filter-out \ + big5% bz2% cp932.py cp949.py cp950.py euc_% \ + gb% iso2022% johab.py shift_jis% , \ + $(shell cd Lib/encodings && echo *.py)), \ + encodings/$(i)) \ + codecs.py stringprep.py + +with_tk := no +with_interp := static +#with_interp := shared + +PY_INTERPRETER = /usr/bin/python$(VER) + +ifeq ($(DEFAULT_VERSION),yes) + PY_PRIO = standard + #PYSTDDEP = , python (>= $(VER)) +else + PY_PRIO = optional +endif +ifeq ($(distribution),Ubuntu) + PY_MINPRIO = required + PY_MINPRIO = optional + with_fpectl = yes + #with_udeb = yes +else + PY_MINPRIO = $(PY_PRIO) + with_fpectl = yes +endif +ifeq (,$(filter $(distrelease),lenny etch squeeze wheezy lucid maverick natty oneiric)) + bd_qual = :any +endif +ifeq (,$(filter $(distrelease),lenny etch squeeze wheezy lucid maverick natty oneiric)) + ma_filter = cat +else + ma_filter = grep -v '^Multi-Arch:' +endif +ifneq (,$(filter $(distrelease),sid experimental)) + bd_gcc = gcc (>= 4:4.8.2-4), +endif + +CC=$(DEB_HOST_GNU_TYPE)-gcc +CXX=$(DEB_HOST_GNU_TYPE)-g++ + +AR=$(DEB_HOST_GNU_TYPE)-ar +RANLIB=$(DEB_HOST_GNU_TYPE)-ranlib + +DPKG_CPPFLAGS:= $(shell dpkg-buildflags --get CPPFLAGS) +DPKG_CFLAGS := $(shell dpkg-buildflags --get CFLAGS) +DPKG_LDFLAGS := $(shell dpkg-buildflags --get LDFLAGS) +OPT_CFLAGS := $(filter-out -O%,$(DPKG_CFLAGS)) # default is -O3 +DEBUG_CFLAGS := $(patsubst -O%,-O0,$(DPKG_CFLAGS)) + +# on alpha, use -O2 only, use -mieee +ifeq ($(DEB_HOST_ARCH),alpha) + OPT_CFLAGS += -mieee + DEBUG_CFLAGS += -mieee + EXTRA_OPT_FLAGS += -O2 +endif +ifeq ($(DEB_HOST_ARCH),m68k) + EXTRA_OPT_FLAGS += -O2 +endif + +ifeq ($(DEB_HOST_GNU_TYPE),$(DEB_BUILD_GNU_TYPE)) + ifeq ($(DEB_HOST_ARCH_OS),linux) + ifneq (,$(findstring $(DEB_HOST_ARCH), amd64 armel armhf i386 powerpc ppc64 ppc64el)) + with_pgo := yes + endif + endif +endif + +ifneq (,$(findstring $(DEB_HOST_ARCH), amd64 armel armhf i386 powerpc ppc64 ppc64el)) + with_lto := yes +endif + +ifneq (,$(findstring noopt, $(DEB_BUILD_OPTIONS))) + OPT_CFLAGS := $(filter-out -O%, $(OPT_CFLAGS)) + EXTRA_OPT_CFLAGS = -O0 + with_pgo = + with_lto = +endif + +ifeq ($(with_lto),yes) + LTO_CFLAGS = -g -flto -fuse-linker-plugin + with_fat_lto := $(shell dpkg --compare-versions $$($(CC) --version \ + | sed -n '/^$(CC)/s,.* ,,p') ge 4.9 && echo yes) + ifeq ($(with_fat_lto),yes) + LTO_CFLAGS += -ffat-lto-objects + endif + EXTRA_OPT_CFLAGS += $(LTO_CFLAGS) + AR=$(DEB_HOST_GNU_TYPE)-gcc-ar + RANLIB=$(DEB_HOST_GNU_TYPE)-gcc-ranlib +endif + +make_build_target = $(if $(with_pgo),profile-opt) + +buildd_static := $(CURDIR)/build-static +buildd_shared := $(CURDIR)/build-shared +buildd_debug := $(CURDIR)/build-debug +buildd_shdebug := $(CURDIR)/build-shdebug + +d := debian/tmp +scriptdir = usr/share/lib/python$(VER) +scriptdir = usr/share/python$(VER) +scriptdir = usr/lib/python$(VER) + +# package names and directories +p_base := $(PVER) +p_min := $(PVER)-minimal +p_lib := lib$(PVER) +p_tk := $(PVER)-tk +p_dev := $(PVER)-dev +p_exam := $(PVER)-examples +p_idle := idle-$(PVER) +p_doc := $(PVER)-doc +p_dbg := 
$(PVER)-dbg +p_udeb := $(PVER)-udeb +p_venv := $(PVER)-venv + +p_lbase := lib$(PVER)-stdlib +p_lmin := lib$(PVER)-minimal +p_ldev := lib$(PVER)-dev +p_ldbg := lib$(PVER)-dbg +p_ltst := lib$(PVER)-testsuite + +d_base := debian/$(p_base) +d_min := debian/$(p_min) +d_lib := debian/$(p_lib) +d_tk := debian/$(p_tk) +d_dev := debian/$(p_dev) +d_exam := debian/$(p_exam) +d_idle := debian/$(p_idle) +d_doc := debian/$(p_doc) +d_dbg := debian/$(p_dbg) +d_udeb := debian/$(p_udeb) +d_venv := debian/$(p_venv) + +d_lbase := debian/$(p_lbase) +d_lmin := debian/$(p_lmin) +d_ldev := debian/$(p_ldev) +d_ldbg := debian/$(p_ldbg) +d_ltst := debian/$(p_ltst) + +build-arch: stamps/stamp-build +build-indep: stamps/stamp-build-doc +build: build-arch +stamps/stamp-build: stamps/stamp-build-static stamps/stamp-mincheck \ + stamps/stamp-build-shared stamps/stamp-build-debug \ + stamps/stamp-build-shared-debug \ + stamps/stamp-check stamps/stamp-pystone stamps/stamp-pybench + touch $@ + +PROFILE_EXCLUDES = test_compiler test_distutils test_subprocess \ + test_multiprocessing test_socketserver \ + test_thread test_threaded_import test_threadedtempfile \ + test_threading test_threading_local test_threadsignals \ + test_concurrent_futures test_ctypes \ + test_dbm_dumb test_dbm_ndbm test_pydoc test_sundry \ + test_signal test_ioctl test_gdb test_ensurepip test_venv + +# FIXME: these fail in the profile build +PROFILE_EXCLUDES += \ + test_cmd_line_script test_zipimport_support + +# FIXME: update profiled-build.diff to support --next +# --next=20 +PROFILE_TASK = ../Lib/test/regrtest.py \ + -s \ + -j 1 -unone,decimal \ + -x $(sort $(TEST_EXCLUDES) $(PROFILE_EXCLUDES)) + +stamps/stamp-build-static: stamps/stamp-configure-static + dh_testdir + $(MAKE) $(NJOBS) -C $(buildd_static) \ + EXTRA_CFLAGS="$(EXTRA_OPT_CFLAGS)" \ + PROFILE_TASK='$(PROFILE_TASK)' $(make_build_target) + + : # check that things are correctly built +ifeq ($(DEB_HOST_GNU_TYPE),$(DEB_BUILD_GNU_TYPE)) + ifneq (,$(filter $(DEB_HOST_ARCH_OS), linux)) + cd $(buildd_static) && ./python -c 'from _multiprocessing import SemLock' + endif +endif + + touch stamps/stamp-build-static + +run-profile-task: + $(MAKE) -C $(buildd_static) \ + PROFILE_TASK='$(PROFILE_TASK)' run_profile_task + +stamps/stamp-build-shared: stamps/stamp-configure-shared + dh_testdir + $(MAKE) $(NJOBS) -C $(buildd_shared) \ + EXTRA_CFLAGS="$(EXTRA_OPT_CFLAGS)" + : # build a static library with PIC objects + $(MAKE) $(NJOBS) -C $(buildd_shared) \ + EXTRA_CFLAGS="$(EXTRA_OPT_CFLAGS)" \ + LIBRARY=libpython$(VER)m-pic.a libpython$(VER)m-pic.a + touch stamps/stamp-build-shared + +stamps/stamp-build-debug: stamps/stamp-configure-debug + dh_testdir + $(MAKE) $(NJOBS) -C $(buildd_debug) \ + EXTRA_CFLAGS="$(DEBUG_CFLAGS)" + touch stamps/stamp-build-debug + +stamps/stamp-build-shared-debug: stamps/stamp-configure-shared-debug + dh_testdir + : # build the shared debug library + $(MAKE) $(NJOBS) -C $(buildd_shdebug) \ + EXTRA_CFLAGS="$(DEBUG_CFLAGS)" \ + libpython$(VER)dm.so pybuilddir.txt + touch stamps/stamp-build-shared-debug + +common_configure_args = \ + --prefix=/usr \ + --enable-ipv6 \ + --enable-loadable-sqlite-extensions \ + --with-dbmliborder=bdb:gdbm \ + --with-computed-gotos \ + --without-ensurepip \ + --with-system-expat \ + --with-system-libmpdec \ + +ifneq (,$(filter $(DEB_HOST_ARCH), avr32 or1k)) + common_configure_args += --without-ffi +else + common_configure_args += --with-system-ffi +endif + +ifeq ($(with_fpectl),yes) + common_configure_args += \ + --with-fpectl +endif + +ifneq 
($(DEB_HOST_GNU_TYPE),$(DEB_BUILD_GNU_TYPE)) + common_configure_args += --host=$(DEB_HOST_GNU_TYPE) --build=$(DEB_BUILD_GNU_TYPE) + config_site = ac_cv_file__dev_ptmx=yes ac_cv_file__dev_ptc=yes + ifeq (,$(filter $(DEB_HOST_ARCH),arm m68k)) + ifeq ($(DEB_HOST_ARCH_ENDIAN),little) + config_site += ac_cv_little_endian_double=yes + else + config_site += ac_cv_big_endian_double=yes + endif + endif +endif + +stamps/stamp-configure-shared: stamps/stamp-patch + rm -rf $(buildd_shared) + mkdir -p $(buildd_shared) + cd $(buildd_shared) && \ + CC="$(CC)" CXX="$(CXX)" AR="$(AR)" RANLIB="$(RANLIB)" CFLAGS="$(OPT_CFLAGS)" \ + CPPFLAGS="$(DPKG_CPPFLAGS)" LDFLAGS="$(DPKG_LDFLAGS)" \ + $(config_site) \ + ../configure \ + --enable-shared \ + $(common_configure_args) + + $(call __post_configure,$(buildd_shared)) + + @echo XXXXXXX pyconfig.h + -cat $(buildd_shared)/pyconfig.h + + touch $@ + +stamps/stamp-configure-static: stamps/stamp-patch + rm -rf $(buildd_static) + mkdir -p $(buildd_static) + cd $(buildd_static) && \ + CC="$(CC)" CXX="$(CXX)" AR="$(AR)" RANLIB="$(RANLIB)" CFLAGS="$(OPT_CFLAGS)" \ + CPPFLAGS="$(DPKG_CPPFLAGS)" LDFLAGS="$(DPKG_LDFLAGS)" \ + $(config_site) \ + ../configure \ + $(common_configure_args) + + $(call __post_configure,$(buildd_static)) + touch $@ + +stamps/stamp-configure-debug: stamps/stamp-patch + rm -rf $(buildd_debug) + mkdir -p $(buildd_debug) + cd $(buildd_debug) && \ + CC="$(CC)" CXX="$(CXX)" AR="$(AR)" RANLIB="$(RANLIB)" CFLAGS="$(DEBUG_CFLAGS)" \ + CPPFLAGS="$(DPKG_CPPFLAGS)" LDFLAGS="$(DPKG_LDFLAGS)" \ + $(config_site) \ + ../configure \ + $(common_configure_args) \ + --with-pydebug + + $(call __post_configure,$(buildd_debug)) + touch $@ + +stamps/stamp-configure-shared-debug: stamps/stamp-patch + rm -rf $(buildd_shdebug) + mkdir -p $(buildd_shdebug) + cd $(buildd_shdebug) && \ + CC="$(CC)" CXX="$(CXX)" AR="$(AR)" RANLIB="$(RANLIB)" CFLAGS="$(DEBUG_CFLAGS)" \ + CPPFLAGS="$(DPKG_CPPFLAGS)" LDFLAGS="$(DPKG_LDFLAGS)" \ + $(config_site) \ + ../configure \ + $(common_configure_args) \ + --enable-shared \ + --with-pydebug + + $(call __post_configure,$(buildd_shdebug)) + touch $@ + +define __post_configure + egrep \ + "^#($$(awk -v ORS='|' '$$2 ~ /^extension$$/ {print $$1}' debian/PVER-minimal.README.Debian.in)XX)" \ + Modules/Setup.dist \ + | sed -e 's/^#//' -e 's/-Wl,-Bdynamic//;s/-Wl,-Bstatic//' \ + >> $(1)/Modules/Setup.local + + : # unconditionally run makesetup + cd $(1) && \ + ../Modules/makesetup -c ../Modules/config.c.in -s Modules \ + Modules/Setup.config Modules/Setup.local Modules/Setup + mv $(1)/config.c $(1)/Modules/ + + : # and fix the timestamps + $(MAKE) -C $(1) Makefile Modules/config.c +endef + +stamps/stamp-mincheck: stamps/stamp-build-static debian/PVER-minimal.README.Debian.in +ifeq ($(DEB_HOST_GNU_TYPE),$(DEB_BUILD_GNU_TYPE)) + for m in $(MIN_MODS) $(MIN_PACKAGES) $(MIN_EXTS) $(MIN_BUILTINS); do \ + echo "import $$m"; \ + done > $(buildd_static)/minmods.py + cd $(buildd_static) && ./python ../debian/pymindeps.py minmods.py \ + > $(buildd_static)/mindeps.txt + -if [ -x /usr/bin/dot ]; then \ + cd $(buildd_static) && ./python ../debian/depgraph.py \ + < $(buildd_static)/mindeps.txt > $(buildd_static)/mindeps.dot; \ + dot -Tpng -o $(buildd_static)/mindeps.png \ + $(buildd_static)/mindeps.dot; \ + else true; fi + cd $(buildd_static) && ./python ../debian/mincheck.py \ + minmods.py mindeps.txt +endif + touch stamps/stamp-mincheck + +TEST_RESOURCES = all +ifeq ($(on_buildd),yes) + TEST_RESOURCES := $(TEST_RESOURCES),-network,-urlfetch +endif +TESTOPTS = -j 1 -w 
-u$(TEST_RESOURCES) +TEST_EXCLUDES = +TEST_EXCLUDES += test_ensurepip test_venv +ifeq ($(on_buildd),yes) + TEST_EXCLUDES += test_tcl test_codecmaps_cn test_codecmaps_hk \ + test_codecmaps_jp test_codecmaps_kr test_codecmaps_tw \ + test_normalization test_ossaudiodev +endif +ifeq (,$(wildcard /dev/dsp)) + TEST_EXCLUDES += test_linuxaudiodev test_ossaudiodev +endif +ifneq (,$(filter $(DEB_HOST_ARCH), hppa)) + TEST_EXCLUDES += test_fork1 test_multiprocessing test_socketserver test_threading test_wait3 test_wait4 test_gdb +endif +# hangs on Aarch64, see LP: #1264354 +ifneq (,$(filter $(DEB_HOST_ARCH),arm64)) + TEST_EXCLUDES += test_faulthandler +endif +ifneq (,$(filter $(DEB_HOST_ARCH), arm avr32)) + TEST_EXCLUDES += test_ctypes +endif +ifneq (,$(filter $(DEB_HOST_ARCH), arm armel avr32 m68k)) + ifeq ($(on_buildd),yes) + TEST_EXCLUDES += test_compiler + endif +endif +ifneq (,$(filter $(DEB_HOST_ARCH), sparc sparc64)) + TEST_EXCLUDES += test_gdb +endif + +# FIXME: re-enable once fixed, see #708652 +ifneq (,$(filter $(DEB_HOST_ARCH_OS), hurd)) + TEST_EXCLUDES += test_asyncore test_curses test_exceptions \ + test_faulthandler test_imaplib test_io test_logging test_mmap \ + test_random test_signal test_socket test_socketserver test_ssl \ + test_threading test_pydoc test_runpy test_telnetlib test_tk +endif + +# FIXME: re-enable once fixed, see #708653 +ifneq (,$(filter $(DEB_HOST_ARCH_OS), kfreebsd)) + TEST_EXCLUDES += test_io test_signal test_socket test_socketserver \ + test_threading test_threadsignals test_threaded_import \ + test_time test_pty test_curses +endif + +# for debug builds only +TEST_EXCLUDES += test_gdb + +ifneq (,$(TEST_EXCLUDES)) + TESTOPTS += -x $(sort $(TEST_EXCLUDES)) +endif + +ifneq (,$(wildcard /usr/bin/localedef)) + SET_LOCPATH = LOCPATH=$(CURDIR)/locales +endif + +stamps/stamp-check: +ifeq ($(WITHOUT_CHECK),yes) + echo "check run disabled for this build" > $(buildd_static)/test_results +else + : # build locales needed by the testsuite + rm -rf locales + mkdir locales + if which localedef >/dev/null 2>&1; then \ + sh debian/locale-gen; \ + fi + + @echo ========== test environment ============ + @env + @echo ======================================== + + ifeq (,$(findstring $(DEB_HOST_ARCH), alpha)) + ( \ + echo '#! /bin/sh'; \ + echo 'set -x'; \ + echo 'export $(SET_LOCPATH)'; \ + echo '$(buildd_static)/python $(CURDIR)/debian/script.py test_results '\''make test TESTOPTS="$(filter-out test_gdb,$(TESTOPTS))"'\'; \ + echo 'echo DONE'; \ + ) > $(buildd_debug)/run_tests + chmod 755 $(buildd_debug)/run_tests + @echo "BEGIN test debug" + -tmphome=$$(mktemp -d); export HOME=$$tmphome; \ + cd $(buildd_debug) && time xvfb-run -a -e xvfb-run.log ./run_tests; \ + rm -rf $$tmphome + @echo "END test debug" + endif + + ( \ + echo '#! /bin/sh'; \ + echo 'set -x'; \ + echo 'export $(SET_LOCPATH)'; \ + echo '$(buildd_static)/python $(CURDIR)/debian/script.py test_results '\''make test EXTRA_CFLAGS="$(EXTRA_OPT_CFLAGS)" TESTOPTS="$(TESTOPTS)"'\'; \ + echo 'echo DONE'; \ + ) > $(buildd_static)/run_tests + chmod 755 $(buildd_static)/run_tests + @echo "BEGIN test static" + -tmphome=$$(mktemp -d); export HOME=$$tmphome; \ + cd $(buildd_static) && time xvfb-run -a -e xvfb-run.log ./run_tests; \ + rm -rf $$tmphome + @echo "END test static" + + ( \ + echo '#! 
/bin/sh'; \ + echo 'set -x'; \ + echo 'export $(SET_LOCPATH)'; \ + echo '$(buildd_static)/python $(CURDIR)/debian/script.py test_results '\''make test EXTRA_CFLAGS="$(EXTRA_OPT_CFLAGS)" TESTOPTS="$(TESTOPTS)"'\'; \ + echo 'echo DONE'; \ + ) > $(buildd_shared)/run_tests + chmod 755 $(buildd_shared)/run_tests + @echo "BEGIN test shared" + -tmphome=$$(mktemp -d); export HOME=$$tmphome; \ + cd $(buildd_shared) && time xvfb-run -a -e xvfb-run.log ./run_tests; \ + rm -rf $$tmphome + @echo "END test shared" +endif + cp -p $(buildd_static)/test_results debian/ + touch stamps/stamp-check + +stamps/stamp-pystone: +ifeq ($(WITHOUT_STONE),yes) + @echo "pystone run disabled for this build" +else + @echo "BEGIN pystone static" + cd $(buildd_static) && ./python ../Lib/test/pystone.py + cd $(buildd_static) && ./python ../Lib/test/pystone.py + @echo "END pystone static" + @echo "BEGIN pystone shared" + cd $(buildd_shared) \ + && LD_LIBRARY_PATH=. ./python ../Lib/test/pystone.py + cd $(buildd_shared) \ + && LD_LIBRARY_PATH=. ./python ../Lib/test/pystone.py + @echo "END pystone shared" + @echo "BEGIN pystone debug" + cd $(buildd_debug) && ./python ../Lib/test/pystone.py + cd $(buildd_debug) && ./python ../Lib/test/pystone.py + @echo "END pystone debug" +endif + touch stamps/stamp-pystone + +stamps/stamp-pybench: + echo "pybench run disabled for this build" > $(buildd_static)/pybench.log + +#ifeq (,$(filter $(DEB_HOST_ARCH), arm armel avr32 hppa mips mipsel mips64 mips64el m68k)) + pybench_options = -C 2 -n 5 -w 4 +#endif + +stamps/stamp-pybenchx: +ifeq ($(WITHOUT_BENCH),yes) + echo "pybench run disabled for this build" > $(buildd_static)/pybench.log +else + @echo "BEGIN pybench static" + cd $(buildd_static) \ + && time ./python ../Tools/pybench/pybench.py -f run1.pybench $(pybench_options) + cd $(buildd_static) \ + && ./python ../Tools/pybench/pybench.py -f run2.pybench -c run1.pybench $(pybench_options) + @echo "END pybench static" + @echo "BEGIN pybench shared" + cd $(buildd_shared) \ + && LD_LIBRARY_PATH=. ./python ../Tools/pybench/pybench.py -f run1.pybench $(pybench_options) + cd $(buildd_shared) \ + && LD_LIBRARY_PATH=. 
./python ../Tools/pybench/pybench.py -f run2.pybench -c run1.pybench $(pybench_options) + @echo "END pybench shared" + @echo "BEGIN shared/static comparision" + $(buildd_static)/python Tools/pybench/pybench.py \ + -s $(buildd_static)/run2.pybench -c $(buildd_shared)/run2.pybench \ + | tee $(buildd_static)/pybench.log + @echo "END shared/static comparision" +endif + touch stamps/stamp-pybench + +minimal-test: + rm -rf mintest + mkdir -p mintest/lib mintest/dynlib mintest/testlib mintest/all-lib + cp -p $(buildd_static)/python mintest/ + cp -p $(foreach i,$(MIN_MODS),Lib/$(i).py) \ + mintest/lib/ + cp -a $(foreach i,$(MIN_PACKAGES),Lib/$(i)) \ + mintest/lib/ + cp -p $(wildcard $(foreach i,$(MIN_EXTS),$(buildd_static)/build/lib*/$(i).*.so)) \ + mintest/dynlib/ + cp -p Lib/unittest.py mintest/lib/ + cp -pr Lib/test mintest/lib/ + cp -pr Lib mintest/all-lib + cp -p $(buildd_static)/build/lib*/*.so mintest/all-lib/ + ( \ + echo "import sys"; \ + echo "sys.path = ["; \ + echo " '$(CURDIR)/mintest/lib',"; \ + echo " '$(CURDIR)/mintest/dynlib',"; \ + echo "]"; \ + cat Lib/test/regrtest.py; \ + ) > mintest/lib/test/mintest.py + cd mintest && ./python -E -S lib/test/mintest.py \ + -x test_codecencodings_cn test_codecencodings_hk \ + test_codecencodings_jp test_codecencodings_kr \ + test_codecencodings_tw test_codecs test_multibytecodec \ + +stamps/stamp-doc-html: + dh_testdir + $(MAKE) -C Doc html + touch stamps/stamp-doc-html + +build-doc: stamps/stamp-patch stamps/stamp-build-doc +stamps/stamp-build-doc: stamps/stamp-doc-html + touch stamps/stamp-build-doc + +control-file: + sed -e "s/@PVER@/$(PVER)/g" \ + -e "s/@VER@/$(VER)/g" \ + -e "s/@PYSTDDEP@/$(PYSTDDEP)/g" \ + -e "s/@PRIO@/$(PY_PRIO)/g" \ + -e "s/@MINPRIO@/$(PY_MINPRIO)/g" \ + -e "s/@bd_qual@/$(bd_qual)/g" \ + -e "s/@bd_gcc@/$(bd_gcc)/g" \ + debian/control.in \ + $(if $(with_udeb),debian/control.udeb) \ + | $(ma_filter) \ + > debian/control.tmp +ifeq ($(distribution),Ubuntu) + ifneq (,$(findstring ubuntu, $(PKGVERSION))) + m='Ubuntu Core Developers '; \ + sed -i "/^Maintainer:/s/\(.*\)/Maintainer: $$m\nXSBC-Original-\1/" \ + debian/control.tmp + endif +endif + [ -e debian/control ] \ + && cmp -s debian/control debian/control.tmp \ + && rm -f debian/control.tmp && exit 0; \ + mv debian/control.tmp debian/control + + + +clean: control-file + dh_testdir + dh_testroot + $(MAKE) -f debian/rules unpatch + rm -rf stamps .pc + rm -f debian/test_results + + $(MAKE) -C Doc clean + sed 's/^@/#/' Makefile.pre.in | $(MAKE) -f - srcdir=. 
distclean + rm -rf $(buildd_static) $(buildd_shared) $(buildd_debug) $(buildd_shdebug) + find -name '*.py[co]' | xargs -r rm -f + rm -f Lib/lib2to3/*.pickle + rm -f Lib/dist-packages + rm -rf Lib/plat-$(DEB_HOST_MULTIARCH) + rm -rf locales + rm -rf $(d)-dbg + + for f in debian/*.in; do \ + f2=`echo $$f | sed "s,PVER,$(PVER),g;s/@VER@/$(VER)/g;s,\.in$$,,"`; \ + if [ $$f2 != debian/control ] && [ $$f2 != debian/source.lintian-overrides ]; then \ + rm -f $$f2; \ + fi; \ + done + + dh_clean + +stamps/stamp-control: + : # We have to prepare the various control files + + for f in debian/*.in; do \ + f2=`echo $$f | sed "s,PVER,$(PVER),g;s/@VER@/$(VER)/g;s,\.in$$,,"`; \ + if [ $$f2 != debian/control ]; then \ + sed -e "s/@PVER@/$(PVER)/g;s/@VER@/$(VER)/g;s/@SVER@/$(SVER)/g" \ + -e "s/@PRIORITY@/$(PRIORITY)/g" \ + -e "s,@SCRIPTDIR@,/$(scriptdir),g" \ + -e "s,@INFO@,$(info_docs),g" \ + -e "s,@HOST_QUAL@,:$(DEB_HOST_ARCH),g" \ + <$$f >$$f2; \ + fi; \ + done + +2to3-man: + help2man --no-info --version-string=$(VER) --no-discard-stderr \ + --name 'Python2 to Python3 converter' \ + 2to3-$(VER) > debian/2to3-3.1 + help2man --no-info --version-string=$(VER) --no-discard-stderr \ + --name 'pysetup tool' \ + pysetup$(VER) > debian/pysetup3.1 + help2man --no-info --version-string=$(VER) --no-discard-stderr \ + --name 'create virtual python environments' \ + pyvenv-$(VER) > debian/pyvenv3.1 + +install: build-arch stamps/stamp-install +stamps/stamp-install: stamps/stamp-build control-file stamps/stamp-control + dh_testdir + dh_testroot + dh_clean -k + dh_installdirs + + : # make install into tmp and subsequently move the files into + : # their packages' directories. + install -d $(d)/usr +ifeq ($(with_interp),static) + $(MAKE) -C $(buildd_static) install prefix=$(CURDIR)/$(d)/usr + sed -e '/^OPT/s,-O3,-O2,' \ + -e 's/$(LTO_CFLAGS)//g' \ + -e 's,^RUNSHARED *=.*,RUNSHARED=,' \ + -e '/BLDLIBRARY/s/-L\. //' \ + $(buildd_shared)/$(shell cat $(buildd_shared)/pybuilddir.txt)/_sysconfigdata.py \ + > $(d)/$(scriptdir)/_sysconfigdata.py +else + $(MAKE) -C $(buildd_shared) install prefix=$(CURDIR)/$(d)/usr +endif + mkdir -p $(d)/usr/include/$(DEB_HOST_MULTIARCH)/$(PVER)m + mv $(d)/usr/include/$(PVER)m/pyconfig.h \ + $(d)/usr/include/$(DEB_HOST_MULTIARCH)/$(PVER)m/. 
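+# Illustrative effect of the install steps above (assuming an amd64 build):
+# the public pyconfig.h ends up as
+# /usr/include/x86_64-linux-gnu/python3.4m/pyconfig.h, and _sysconfigdata.py
+# is taken from the shared build with -O3 lowered to -O2 and the LTO /
+# profile-feedback flags stripped, presumably so that modules built later
+# against this Python do not inherit build-only compiler flags.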
+ rm -f $(d)/$(scriptdir)/lib-dynload/*.py + sed -i 's/ -O3 / -O2 /g;s/$(LTO_CFLAGS)//g;s/-fprofile-use *-fprofile-correction//g' \ + $(d)/$(scriptdir)/_sysconfigdata.py + mv $(d)/$(scriptdir)/_sysconfigdata.py \ + $(d)/$(scriptdir)/plat-$(DEB_HOST_MULTIARCH)/_sysconfigdata_m.py + cp -p debian/_sysconfigdata.py $(d)/$(scriptdir)/ + + -find $(d)/usr/lib/python$(VER) -name '*_failed*.so' + find $(d)/usr/lib/python$(VER) -name '*_failed*.so' | xargs -r rm -f + + for i in $(d)/$(scriptdir)/lib-dynload/*.so; do \ + b=$$(basename $$i .cpython-34m.so); \ + d=$${b}.cpython-34m-$(DEB_HOST_MULTIARCH).so; \ + mv $$i $(d)/$(scriptdir)/lib-dynload/$$d; \ + done + + mv $(d)/usr/lib/libpython*.a $(d)/usr/lib/$(DEB_HOST_MULTIARCH)/ + + mkdir -p $(d)/usr/lib/python3 + mv $(d)/usr/lib/python$(VER)/site-packages \ + $(d)/usr/lib/python3/dist-packages + rm -f $(d)/usr/lib/python3/dist-packages/README + + : # remove files, which are not packaged + rm -rf $(d)/usr/lib/python$(VER)/ctypes/macholib + rm -f $(d)/$(scriptdir)/plat-*/regen + rm -f $(d)/$(scriptdir)/lib2to3/*.pickle + rm -f $(d)/usr/share/man/man1/python3.1 + + : # cannot build it, zlib maintainer won't provide a mingw build + find $(d) -name 'wininst*.exe' | xargs -r rm -f + + : # fix some file permissions + chmod a-x $(d)/$(scriptdir)/{runpy,fractions,lib2to3/refactor,tkinter/tix}.py + chmod a-x $(d)/$(scriptdir)/test/test_pathlib.py + +# : # move manpages to new names +# if [ -d $(d)/usr/man/man1 ]; then \ +# mkdir -p $(d)/usr/share/man +# mv $(d)/usr/man/man1/* $(d)/usr/share/man/man1/; \ +# rm -rf $(d)/usr/man/; \ +# fi + + mkdir -p $(d)/usr/share/man/man1 + cp -p Misc/python.man $(d)/usr/share/man/man1/python$(VER).1 + ln -sf python$(VER).1 $(d)/usr/share/man/man1/python$(VER)m.1 + cp -p debian/pydoc.1 $(d)/usr/share/man/man1/pydoc$(VER).1 + + : # Symlinks to /usr/bin for some tools + ln -sf ../lib/python$(VER)/pdb.py $(d)/usr/bin/pdb$(VER) + cp -p debian/pdb.1 $(d)/usr/share/man/man1/pdb$(VER).1 + cp -p debian/2to3-3.1 $(d)/usr/share/man/man1/2to3-$(VER).1 + cp -p debian/pysetup3.1 $(d)/usr/share/man/man1/pysetup$(VER).1 + cp -p debian/pyvenv3.1 $(d)/usr/share/man/man1/pyvenv-$(VER).1 + + : # versioned install only + rm -f $(d)/usr/bin/{2to3,idle3,pydoc3,pysetup3,python3,python3-config} + rm -f $(d)/usr/lib/*/pkgconfig/python3.pc + + dh_installdirs -p$(p_lib) \ + usr/lib/$(DEB_HOST_MULTIARCH) \ + $(scriptdir)/config-$(VER)m-$(DEB_HOST_MULTIARCH) \ + usr/share/doc + : # install the shared library + cp -p $(buildd_shared)/libpython$(VER)m.so.1.0 \ + $(d_lib)/usr/lib/$(DEB_HOST_MULTIARCH)/ + dh_link -p$(p_lib) \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)m.so.1.0 \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)m.so.1 \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)m.so.1 \ + /$(scriptdir)/config-$(VER)m-$(DEB_HOST_MULTIARCH)/libpython$(VER)m.so \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)m.so.1 \ + /$(scriptdir)/config-$(VER)m-$(DEB_HOST_MULTIARCH)/libpython$(VER).so + + ln -sf $(p_base) $(d_lib)/usr/share/doc/$(p_lib) + + ln -sf libpython$(VER)m.so.1 $(d)/usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)m.so + +ifeq ($(with_interp),shared) + : # install the statically linked runtime + install -m755 $(buildd_static)/python $(d)/usr/bin/python$(VER)-static +endif + + cp -p Tools/i18n/pygettext.py $(d)/usr/bin/pygettext$(VER) + cp -p debian/pygettext.1 $(d)/usr/share/man/man1/pygettext$(VER).1 + + : # install the Makefile of the shared python build + sed -e '/^OPT/s,-O3,-O2,' \ + -e 's/$(LTO_CFLAGS)//g' \ + -e 's,^RUNSHARED 
*=.*,RUNSHARED=,' \ + -e '/BLDLIBRARY/s/-L\. //' \ + $(buildd_shared)/Makefile \ + > $(d)/$(scriptdir)/config-$(VER)m-$(DEB_HOST_MULTIARCH)/Makefile + + : # Move the minimal libraries into $(p_lmin). + dh_installdirs -p$(p_lmin) \ + etc/$(PVER) \ + usr/bin \ + usr/share/man/man1 \ + $(scriptdir)/lib-dynload + -cd $(d); for i in $(MIN_EXTS); do \ + test -e $(scriptdir)/lib-dynload/$$i.*.so \ + && echo $(scriptdir)/lib-dynload/$$i.*.so; \ + done + + DH_COMPAT=2 dh_movefiles -p$(p_lmin) --sourcedir=$(d) \ + $(foreach i,$(MIN_MODS),$(scriptdir)/$(i).py) \ + $(foreach i,$(MIN_PACKAGES),$(scriptdir)/$(i)) \ + $(foreach i,$(MIN_ENCODINGS),$(scriptdir)/$(i)) \ + $(scriptdir)/site.py \ + $(scriptdir)/_sysconfigdata.py \ + $(scriptdir)/plat-$(DEB_HOST_MULTIARCH)/_sysconfigdata_m.py \ + `cd $(d); for i in $(MIN_EXTS); do \ + test -e $(scriptdir)/lib-dynload/$$i.*.so \ + && echo $(scriptdir)/lib-dynload/$$i.*.so; \ + done` + ls -l $(d_lmin)/$(scriptdir)/lib-dynload/*.so + + : # Move the binary into $(p_min). + dh_installdirs -p$(p_min) \ + usr/bin \ + usr/share/man/man1 + DH_COMPAT=2 dh_movefiles -p$(p_min) --sourcedir=$(d) \ + usr/bin/python$(VER) \ + usr/bin/python$(VER)m \ + usr/share/man/man1/python$(VER).1 \ + usr/share/man/man1/python$(VER)m.1 + + rv=0; \ + for i in $(MIN_EXTS); do \ + if [ -f $(d)/$(scriptdir)/lib-dynload/$$i.so ]; then \ + echo >&2 "extension $$i not mentioned in Setup.dist"; \ + rv=1; \ + fi; \ + done; \ + exit $$rv; + + : # Install sitecustomize.py + cp -p debian/sitecustomize.py $(d_lmin)/etc/$(PVER)/ + dh_link -p$(p_lmin) \ + /etc/$(PVER)/sitecustomize.py /$(scriptdir)/sitecustomize.py + + : # Move the static library and the header files into $(p_dev). +# mv $(d)/usr/share/include/python$(VER)/* $(d)/usr/include/python$(VER)/. +# rm -rf $(d)/usr/share/include + + cp $(d)/usr/bin/$(PVER)m-config $(d)/usr/bin/$(DEB_HOST_MULTIARCH)-$(PVER)m-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)m-config $(d)/usr/bin/$(DEB_HOST_MULTIARCH)-$(PVER)-config + + dh_installdirs -p$(p_ldev) \ + usr/bin \ + $(scriptdir) \ + usr/include \ + usr/share/man/man1 + + DH_COMPAT=2 dh_movefiles -p$(p_ldev) --sourcedir=$(d) \ + usr/bin/$(DEB_HOST_MULTIARCH)-$(PVER)*-config \ + usr/lib/python$(VER)/config-$(VER)m-$(DEB_HOST_MULTIARCH) \ + usr/include \ + usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)m.{a,so} \ + usr/lib/$(DEB_HOST_MULTIARCH)/pkgconfig/python-$(VER)*.pc \ + usr/lib/python$(VER)/distutils/command/wininst-*.exe + + sed 's/@subdir@/$(PVER)m/;s/@header@/pyconfig.h/' \ + debian/multiarch.h.in > $(d_ldev)/usr/include/$(PVER)m/pyconfig.h + + sed -i '/^Cflags:/s,$$, -I$${includedir}/$(DEB_HOST_MULTIARCH)/python$(VER)m,' \ + $(d_ldev)/usr/lib/$(DEB_HOST_MULTIARCH)/pkgconfig/python-$(VER).pc + + dh_link -p$(p_ldev) \ + /usr/lib/$(PVER)/config-$(VER)m-$(DEB_HOST_MULTIARCH)/libpython$(VER)m.a \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)m.a + + cp -p $(buildd_shared)/libpython$(VER)m-pic.a \ + $(d_ldev)/usr/lib/python$(VER)/config-$(VER)m-$(DEB_HOST_MULTIARCH)/ + + : # symlinks for the "old" include directory name + ln -sf python$(VER)m $(d_ldev)/usr/include/python$(VER) + + dh_installdirs -p$(p_dev) \ + usr/share/doc/python$(VER) \ + usr/share/man/man1 \ + $(scriptdir) \ + $(scriptdir)/doc/html + cp -p Misc/HISTORY Misc/README.valgrind Misc/gdbinit \ + debian/README.maintainers \ + debian/test_results $(buildd_static)/pybench.log \ + $(d_dev)/usr/share/doc/python$(VER)/ + + DH_COMPAT=2 dh_movefiles -p$(p_dev) --sourcedir=$(d) \ + usr/bin/python$(VER)*-config + + : # in $(p_ldev), prefix 
python-config with triplets + cp -p debian/python3-config.1 \ + $(d_ldev)/usr/share/man/man1/$(DEB_HOST_MULTIARCH)-$(PVER)m-config.1 + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)m-config.1.gz \ + $(d_ldev)/usr/share/man/man1/$(DEB_HOST_MULTIARCH)-$(PVER)-config.1.gz +ifneq ($(DEB_HOST_MULTIARCH),$(DEB_HOST_GNU_TYPE)) + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)m-config \ + $(d_ldev)/usr/bin/$(DEB_HOST_GNU_TYPE)-$(PVER)m-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)-config \ + $(d_ldev)/usr/bin/$(DEB_HOST_GNU_TYPE)-$(PVER)-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)-config.1.gz \ + $(d_ldev)/usr/share/man/man1/$(DEB_HOST_GNU_TYPE)-$(PVER)-config.1.gz + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)-config.1.gz \ + $(d_ldev)/usr/share/man/man1/$(DEB_HOST_GNU_TYPE)-$(PVER)m-config.1.gz +endif + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)m-config $(d_dev)/usr/bin/$(PVER)m-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)m-config.1.gz $(d_dev)/usr/share/man/man1/$(PVER)m-config.1.gz + + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)-config $(d_dev)/usr/bin/$(PVER)-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)-config.1.gz $(d_dev)/usr/share/man/man1/$(PVER)-config.1.gz + +ifeq ($(with_tk),yes) + : # Move the Tkinter files into $(p_tk). + dh_installdirs -p$(p_tk) \ + $(scriptdir) \ + usr/lib/python$(VER)/lib-dynload + DH_COMPAT=2 dh_movefiles -p$(p_tk) --sourcedir=$(d) \ + usr/lib/python$(VER)/lib-dynload/_tkinter*.so +endif + + : # The test framework into $(p_lbase) + DH_COMPAT=2 dh_movefiles -p$(p_lbase) --sourcedir=$(d) \ + $(scriptdir)/test/{regrtest.py,support,__init__.py,pystone.py} + + : # The complete testsuite into $(p_lbase) + DH_COMPAT=2 dh_movefiles -p$(p_ltst) --sourcedir=$(d) \ + $(scriptdir)/test \ + $(scriptdir)/ctypes/test \ + $(scriptdir)/distutils/tests \ + $(scriptdir)/lib2to3/tests \ + $(scriptdir)/sqlite3/test \ + $(scriptdir)/idlelib/idle_test \ + $(scriptdir)/tkinter/test \ + $(scriptdir)/unittest/test + : # test_ctypes fails with test_macholib.py installed + rm -f $(d_ltst)/$(scriptdir)/ctypes/test/test_macholib.py + : # test_bdist_wininst fails, '*.exe' files are not installed + rm -f $(d_ltst)/$(scriptdir)/distutils/tests/test_bdist_wininst.py + + : # fixed upstream ... + chmod -x $(d_ltst)/$(scriptdir)/test/{test_dbm_gnu,test_dbm_ndbm}.py + + : # Move the demos and tools into $(p_exam)'s doc directory + dh_installdirs -p$(p_exam) \ + usr/share/doc/python$(VER)/examples + DH_COMPAT=2 dh_movefiles -p$(p_exam) --sourcedir=$(d) \ + $(scriptdir)/turtledemo + + cp -rp Tools/* $(d_exam)/usr/share/doc/python$(VER)/examples/ + rm -rf $(d_exam)/usr/share/doc/python$(VER)/examples/Tools/{buildbot,msi} + : # XXX: We don't need rgb.txt, we'll use our own: + rm -rf $(d_exam)/usr/share/doc/python$(VER)/examples/Tools/pynche/X + + : # IDLE + mv $(d)/usr/bin/idle$(VER) $(d)/usr/bin/idle-python$(VER) + rm -f $(d)/usr/lib/python$(VER)/idlelib/idle.bat + dh_installdirs -p$(p_idle) \ + usr/bin \ + usr/share/man/man1 + DH_COMPAT=2 dh_movefiles -p$(p_idle) --sourcedir=$(d) \ + usr/bin/idle-python$(VER) + cp -p debian/idle-$(PVER).1 $(d_idle)/usr/share/man/man1/ + + : # Replace all '#!' calls to python with $(PY_INTERPRETER) + : # and make them executable + for i in `find debian -mindepth 3 -type f ! -name '*.dpatch'`; do \ + sed '1s,#!.*python[^ ]*\(.*\),#! 
$(PY_INTERPRETER)\1,' \ + $$i > $$i.temp; \ + if cmp --quiet $$i $$i.temp; then \ + rm -f $$i.temp; \ + else \ + mv -f $$i.temp $$i; \ + chmod 755 $$i; \ + echo "fixed interpreter: $$i"; \ + fi; \ + done + + : # Move the docs into $(p_base)'s /usr/share/doc/$(PVER) directory, + : # all other packages only have a copyright file. + dh_installdocs -p$(p_base) \ + README Misc/NEWS Misc/ACKS + ln -sf NEWS.gz $(d_base)/usr/share/doc/$(p_base)/changelog.gz + dh_installdocs --all -N$(p_base) -N$(p_dev) -N$(p_dbg) -N$(p_lib) debian/README.Debian + + : # IDLE has its own changelogs, docs... + dh_installchangelogs -p$(p_idle) Lib/idlelib/ChangeLog + dh_installdocs -p$(p_idle) Lib/idlelib/{NEWS,README,TODO,extend}.txt + + mkdir -p $(d_idle)/usr/share/applications + cp -p debian/idle.desktop \ + $(d_idle)/usr/share/applications/idle-$(PVER).desktop + + : # those packages have own README.Debian's + install -m 644 -p debian/README.$(p_base) \ + $(d_base)/usr/share/doc/$(PVER)/README.Debian + install -m 644 -p debian/README.$(p_idle) \ + $(d_idle)/usr/share/doc/$(p_idle)/README.Debian +ifeq ($(with_tk),yes) + cp -p debian/README.Tk $(d_tk)/usr/share/doc/$(p_tk)/ +endif + + : # pyvenv and ensurepip files into $(p_venv) + dh_installdirs -p$(p_venv) \ + usr/bin \ + usr/share/man/man1 \ + usr/lib/python$(VER) + dh_movefiles -p$(p_venv) \ + usr/bin/pyvenv-$(VER) \ + usr/share/man/man1/pyvenv-$(VER).1 \ + usr/lib/python$(VER)/ensurepip + + : # library files into $(p_lbase) + dh_installdirs -p$(p_lbase) \ + usr/lib + dh_movefiles -p$(p_lbase) \ + usr/lib/python$(VER) + + : # The rest goes into $(p_base) + mkdir -p $(d)/usr/lib/python3/dist-packages + (cd $(d) && tar cf - .) | (cd $(d_base) && tar xpf -) + rm -f $(d_base)/usr/bin/python + rm -f $(d_base)/usr/bin/pyvenv + + : # Install menu icon + dh_installdirs -p$(p_base) usr/share/pixmaps + cp -p debian/pylogo.xpm $(d_base)/usr/share/pixmaps/$(PVER).xpm + + : # generate binfmt file + mkdir -p $(d_min)/usr/share/binfmts +ifeq ($(DEB_HOST_GNU_TYPE),$(DEB_BUILD_GNU_TYPE)) + $(buildd_static)/python debian/mkbinfmt.py $(PVER) \ + > $(d_min)/usr/share/binfmts/$(PVER) +else + $(PVER) debian/mkbinfmt.py $(PVER) > $(d_min)/usr/share/binfmts/$(PVER) +endif + + : # desktop entry + mkdir -p $(d_base)/usr/share/applications + cp -p debian/$(PVER).desktop \ + $(d_base)/usr/share/applications/$(PVER).desktop + + : # remove some things + -find debian -name .cvsignore | xargs rm -f + -find debian -name '*.py[co]' | xargs rm -f + + : # remove empty directories, when all components are in place + -find debian ! -name lib-dynload ! -name dist-packages -type d -empty -delete + + : # install debug package + rm -rf $(d)-dbg + $(MAKE) -C $(buildd_debug) install DESTDIR=$(CURDIR)/$(d)-dbg + : # install the Makefile of the shared python debug build + sed -e '/^OPT/s,-O3,-O2,' \ + -e 's/$(LTO_CFLAGS)//g' \ + -e 's,^RUNSHARED *=.*,RUNSHARED=,' \ + -e '/BLDLIBRARY/s/-L\. //' \ + $(buildd_shdebug)/Makefile \ + > $(d)-dbg/$(scriptdir)/config-$(VER)dm-$(DEB_HOST_MULTIARCH)/Makefile + sed -e 's,^RUNSHARED *=.*,RUNSHARED=,' \ + -e '/BLDLIBRARY/s/-L\. 
//' \ + $(buildd_shdebug)/$(shell cat $(buildd_shdebug)/pybuilddir.txt)/_sysconfigdata.py \ + > $(d)-dbg/$(scriptdir)/_sysconfigdata.py + rm -f $(d)-dbg/$(scriptdir)/lib-dynload/_sysconfigdata.py + sed -i 's/ -O3 / -O2 /g;s/$(LTO_CFLAGS)//g;s/-fprofile-use *-fprofile-correction//g' \ + $(d)-dbg/$(scriptdir)/_sysconfigdata.py + mv $(d)-dbg/$(scriptdir)/_sysconfigdata.py \ + $(d)-dbg/$(scriptdir)/plat-$(DEB_HOST_MULTIARCH)/_sysconfigdata_dm.py + + mv $(d)-dbg/usr/lib/libpython*.a $(d)-dbg/usr/lib/$(DEB_HOST_MULTIARCH)/ + + for i in $(d)-dbg/$(scriptdir)/lib-dynload/*.so; do \ + b=$$(basename $$i .cpython-34dm.so); \ + d=$${b}.cpython-34dm-$(DEB_HOST_MULTIARCH).so; \ + mv $$i $(d)-dbg/$(scriptdir)/lib-dynload/$$d; \ + done + + dh_installdirs -p$(p_ldbg) \ + usr/bin \ + usr/share/man/man1 \ + $(scriptdir)/lib-dynload \ + $(scriptdir)/plat-$(DEB_HOST_MULTIARCH) \ + usr/include/$(PVER)dm \ + usr/include/$(DEB_HOST_MULTIARCH)/$(PVER)dm \ + usr/lib/$(DEB_HOST_MULTIARCH)/pkgconfig + + cp -p $(d)-dbg/$(scriptdir)/lib-dynload/*.so \ + $(d_ldbg)/$(scriptdir)/lib-dynload/ + cp -p $(d)-dbg/$(scriptdir)/plat-$(DEB_HOST_MULTIARCH)/_sysconfigdata_dm.py \ + $(d_ldbg)/$(scriptdir)/plat-$(DEB_HOST_MULTIARCH)/ + cp -p $(buildd_shdebug)/libpython$(VER)dm.so.1.0 \ + $(d_ldbg)/usr/lib/$(DEB_HOST_MULTIARCH)/ + dh_link -p$(p_ldbg) \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)dm.so.1.0 \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)dm.so.1 \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)dm.so.1 \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)dm.so + sed -e '/^Libs:/s,-lpython$(VER),-lpython$(VER)dm,' \ + -e '/^Cflags:/s,$$, -I$${includedir}/$(DEB_HOST_MULTIARCH)/python$(VER)dm,' \ + $(d)-dbg/usr/lib/$(DEB_HOST_MULTIARCH)/pkgconfig/python-$(VER).pc \ + > $(d_ldbg)/usr/lib/$(DEB_HOST_MULTIARCH)/pkgconfig/python-$(VER)-dbg.pc + + dh_installdirs -p$(p_dbg) \ + usr/bin \ + usr/share/man/man1 \ + usr/share/doc/$(p_base) + cp -p Misc/SpecialBuilds.txt $(d_dbg)/usr/share/doc/$(p_base)/ + cp -p debian/$(PVER)-dbg.README.Debian \ + $(d_dbg)/usr/share/doc/$(p_base)/README.debug + cp -p $(buildd_debug)/python $(d_dbg)/usr/bin/$(PVER)dm + ln -sf python$(VER)dm $(d_dbg)/usr/bin/$(PVER)-dbg + +ifneq ($(with_tk),yes) + rm -f $(d_ldbg)/$(scriptdir)/lib-dynload/_tkinter*.so + rm -f $(d_ldbg)/usr/lib/debug/$(scriptdir)/lib-dynload/_tkinter*.so +endif +ifneq ($(with_gdbm),yes) + rm -f $(d_ldbg)/$(scriptdir)/lib-dynload/_gdbm*.so + rm -f $(d_ldbg)/usr/lib/debug/$(scriptdir)/lib-dynload/_gdbm*.so +endif + + cp -a $(d)-dbg/$(scriptdir)/config-$(VER)dm-$(DEB_HOST_MULTIARCH) \ + $(d_ldbg)/$(scriptdir)/ + dh_link -p$(p_ldbg) \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)dm.so \ + /$(scriptdir)/config-$(VER)dm-$(DEB_HOST_MULTIARCH)/libpython$(VER)dm.so \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)dm.so \ + /$(scriptdir)/config-$(VER)dm-$(DEB_HOST_MULTIARCH)/libpython$(VER).so \ + /$(scriptdir)/config-$(VER)dm-$(DEB_HOST_MULTIARCH)/libpython$(VER)dm.a \ + /usr/lib/$(DEB_HOST_MULTIARCH)/libpython$(VER)dm.a + + for i in $(d_ldev)/usr/include/$(PVER)m/*; do \ + i=$$(basename $$i); \ + case $$i in pyconfig.h) continue; esac; \ + ln -sf ../$(PVER)m/$$i $(d_ldbg)/usr/include/$(PVER)dm/$$i; \ + done + cp -p $(buildd_debug)/pyconfig.h $(d_ldbg)/usr/include/$(DEB_HOST_MULTIARCH)/$(PVER)dm/ + sed 's/@subdir@/$(PVER)dm/;s/@header@/pyconfig.h/' \ + debian/multiarch.h.in > $(d_ldbg)/usr/include/$(PVER)dm/pyconfig.h + + ln -sf $(PVER).1.gz $(d_dbg)/usr/share/man/man1/$(PVER)-dbg.1.gz + + : # in $(p_ldbg), prefix python-config with 
triplets + cp $(d)-dbg/usr/bin/$(PVER)dm-config \ + $(d_ldbg)/usr/bin/$(DEB_HOST_MULTIARCH)-$(PVER)dm-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)dm-config \ + $(d_ldbg)/usr/bin/$(DEB_HOST_MULTIARCH)-$(PVER)-dbg-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)m-config.1.gz \ + $(d_ldbg)/usr/share/man/man1/$(DEB_HOST_MULTIARCH)-$(PVER)dm-config.1.gz + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)m-config.1.gz \ + $(d_ldbg)/usr/share/man/man1/$(DEB_HOST_MULTIARCH)-$(PVER)-dbg-config.1.gz +ifneq ($(DEB_HOST_MULTIARCH),$(DEB_HOST_GNU_TYPE)) + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)dm-config \ + $(d_ldbg)/usr/bin/$(DEB_HOST_GNU_TYPE)-$(PVER)dm-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)-config.1.gz \ + $(d_ldbg)/usr/share/man/man1/$(DEB_HOST_GNU_TYPE)-$(PVER)dm-config.1.gz + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)dm-config \ + $(d_ldbg)/usr/bin/$(DEB_HOST_GNU_TYPE)-$(PVER)-dbg-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)dm-config.1.gz \ + $(d_ldbg)/usr/share/man/man1/$(DEB_HOST_GNU_TYPE)-$(PVER)-dbg-config.1.gz +endif + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)dm-config $(d_dbg)/usr/bin/$(PVER)dm-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)dm-config.1.gz $(d_dbg)/usr/share/man/man1/$(PVER)dm-config.1.gz + + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)-dbg-config $(d_dbg)/usr/bin/$(PVER)-dbg-config + ln -sf $(DEB_HOST_MULTIARCH)-$(PVER)-dbg-config.1.gz $(d_dbg)/usr/share/man/man1/$(PVER)-dbg-config.1.gz + + : # symlinks for the "old" include / config directory names + ln -sf $(PVER)-config.1.gz $(d_dbg)/usr/share/man/man1/$(PVER)-dbg-config.1.gz + ln -sf $(PVER).1.gz $(d_dbg)/usr/share/man/man1/$(PVER)dm.1.gz + ln -sf $(PVER)-config.1.gz $(d_dbg)/usr/share/man/man1/$(PVER)dm-config.1.gz + +ifeq ($(with_udeb),yes) + : # Copy the most important files from $(p_min) into $(p_udeb). + dh_installdirs -p$(p_udeb) \ + etc/$(PVER) \ + usr/bin \ + usr/include/$(PVER)mu \ + $(scriptdir)/lib-dynload \ + $(scriptdir)/config-$(VER)m-$(DEB_HOST_MULTIARCH) + cp -p $(d_min)/usr/bin/python$(VER) $(d_udeb)/usr/bin/ + ln -sf python$(VER)mu $(d_udeb)/usr/bin/python$(VER) + ln -sf python$(VER) $(d_udeb)/usr/bin/python3 + cp -p $(foreach i,$(MIN_MODS),$(d_min)/$(scriptdir)/$(i).py) \ + $(d_udeb)/$(scriptdir)/ + cp -a $(foreach i,$(MIN_PACKAGES),$(d_min)/$(scriptdir)/$(i)) \ + $(d_udeb)/$(scriptdir)/ + cp -p $(foreach i,$(MIN_ENCODINGS),$(d_min)/$(scriptdir)/$(i)) \ + $(d_udeb)/$(scriptdir)/ + cp -p $(d_min)/$(scriptdir)/config-$(VER)m-$(DEB_HOST_MULTIARCH)/Makefile \ + $(d_udeb)/$(scriptdir)/config-$(VER)m-$(DEB_HOST_MULTIARCH)/ + cp -p $(d_min)/usr/include/$(DEB_HOST_MULTIARCH)/$(PVER)m/pyconfig.h \ + $(d_udeb)/usr/include/$(DEB_HOST_MULTIARCH)/$(PVER)m/ + cp -p $(d_min)/$(scriptdir)/site.py $(d_udeb)/$(scriptdir)/ + cp -p debian/sitecustomize.py $(d_udeb)/etc/$(PVER)/ + dh_link -p$(p_udeb) /etc/$(PVER)/sitecustomize.py \ + /$(scriptdir)/sitecustomize.py +endif + + for i in debian/*.overrides; do \ + b=$$(basename $$i .overrides); \ + install -D -m 644 $$i debian/$$b/usr/share/lintian/overrides/$$b; \ + done + + touch stamps/stamp-install + +# Build architecture-independent files here. 
+binary-indep: build-indep install stamps/stamp-control + dh_testdir -i + dh_testroot -i + + : # $(p_doc) package + dh_installdirs -p$(p_doc) \ + usr/share/doc/$(p_base) \ + usr/share/doc/$(p_doc) + dh_installdocs -p$(p_doc) + cp -a Doc/build/html $(d_doc)/usr/share/doc/$(p_base)/ + rm -f $(d_doc)/usr/share/doc/$(p_base)/html/_static/jquery.js + dh_link -p$(p_doc) \ + /usr/share/doc/$(p_base)/html /usr/share/doc/$(p_doc)/html \ + /usr/share/javascript/jquery/jquery.js /usr/share/doc/$(p_base)/html/_static/jquery.js \ + /usr/share/javascript/underscore/underscore.js /usr/share/doc/$(p_base)/html/_static/underscore.js + + : # devhelp docs + cd $(buildd_static) && ./python ../debian/pyhtml2devhelp.py \ + ../$(d_doc)/usr/share/doc/$(p_base)/html index.html $(VER) \ + > ../$(d_doc)/usr/share/doc/$(p_base)/html/$(PVER).devhelp + gzip -9v $(d_doc)/usr/share/doc/$(p_base)/html/$(PVER).devhelp + dh_link -p$(p_doc) \ + /usr/share/doc/$(p_base)/html /usr/share/devhelp/books/$(PVER) + + for i in $(p_ltst); do \ + rm -rf debian/$$i/usr/share/doc/$$i; \ + ln -s $(p_base) debian/$$i/usr/share/doc/$$i; \ + done + + dh_installdebconf -i $(dh_args) + dh_installexamples -i $(dh_args) + dh_installmenu -i $(dh_args) + -dh_icons -i $(dh_args) || dh_iconcache -i $(dh_args) + dh_installchangelogs -i $(dh_args) + dh_link -i $(dh_args) + dh_compress -i $(dh_args) -X.py -X.cls -X.css -X.txt -X.json -X.js -Xobjects.inv -Xgdbinit + dh_fixperms -i $(dh_args) + + : # make python scripts starting with '#!' executable + for i in `find debian -mindepth 3 -type f ! -name '*.dpatch' ! -perm 755`; do \ + if head -1 $$i | grep -q '^#!'; then \ + chmod 755 $$i; \ + echo "make executable: $$i"; \ + fi; \ + done + -find $(d_doc) -name '*.txt' -perm 755 -exec chmod 644 {} \; + + dh_installdeb -i $(dh_args) + dh_gencontrol -i $(dh_args) + dh_md5sums -i $(dh_args) + dh_builddeb -i $(dh_args) + +# Build architecture-dependent files here. +binary-arch: build-arch install + dh_testdir -a + dh_testroot -a +# dh_installdebconf -a + dh_installexamples -a + dh_installmenu -a + -dh_icons -a || dh_iconcache -a +# dh_installmime -a + dh_installchangelogs -a + for i in $(p_dev) $(p_dbg) $(p_venv); do \ + rm -rf debian/$$i/usr/share/doc/$$i; \ + ln -s $(p_base) debian/$$i/usr/share/doc/$$i; \ + done + for i in $(p_lbase); do \ + rm -rf debian/$$i/usr/share/doc/$$i; \ + ln -s $(p_lmin) debian/$$i/usr/share/doc/$$i; \ + done + for i in $(p_ldev) $(p_ldbg) $(p_lib); do \ + rm -rf debian/$$i/usr/share/doc/$$i; \ + ln -s $(p_lbase) debian/$$i/usr/share/doc/$$i; \ + done + -find debian ! -perm -200 -print -exec chmod +w {} \; +ifneq ($(with_tk),yes) + rm -f $(d_lbase)/$(scriptdir)/lib-dynload/_tkinter*.so +endif +ifneq ($(with_gdbm),yes) + rm -f $(d_lbase)/$(scriptdir)/lib-dynload/_gdbm*.so +endif + + dh_strip -a -N$(p_dbg) -Xdebug -Xdbg --dbg-package=$(p_dbg) + cp Tools/gdb/libpython.py $(d_dbg)/usr/lib/debug/usr/bin/$(PVER)m-gdb.py + ln -sf $(PVER)m-gdb.py $(d_dbg)/usr/lib/debug/usr/bin/$(PVER)-gdb.py + ln -sf $(PVER)m-gdb.py $(d_dbg)/usr/lib/debug/usr/bin/$(PVER)dm-gdb.py + ln -sf $(PVER)m-gdb.py $(d_dbg)/usr/lib/debug/usr/bin/$(PVER)-dbg-gdb.py + ln -sf ../bin/$(PVER)m-gdb.py \ + $(d_dbg)/usr/lib/debug/usr/lib/lib$(PVER)m.so.1.0-gdb.py + ln -sf ../bin/$(PVER)m-gdb.py \ + $(d_dbg)/usr/lib/lib$(PVER)dm.so.1.0-gdb.py + dh_link -a + dh_compress -a -X.py + dh_fixperms -a + chmod 644 $(d_lmin)/$(scriptdir)/token.py + + : # make python scripts starting with '#!' executable + for i in `find debian -mindepth 3 -type f ! -name '*.dpatch' ! 
-perm 755`; do \ + if head -1 $$i | grep -q '^#!'; then \ + chmod 755 $$i; \ + echo "make executable: $$i"; \ + fi; \ + done + + dh_makeshlibs -p$(p_lib) -V '$(p_lib)' + dh_makeshlibs -p$(p_ldbg) -V '$(p_ldbg)' +# don't include the following symbols, found in extensions +# which either can be built as builtin or extension. + sed -ri \ + -e '/^ (PyInit_|_add_one_to_index|asdl_)/d' \ + -e '/^ (PyExpat_XML_|PyExpat_Xml)/d' \ + -e '/^ (ffi_type_|_ctypes_)/d' \ + $(d_lib)/DEBIAN/symbols $(d_ldbg)/DEBIAN/symbols + dh_installdeb -a + dh_shlibdeps -a + dep=`sed -n '/^shlibs:Depends/s/.*\(libc6[^,]*\).*/\1/p' $(d_min).substvars`; \ + echo "shlibs:Pre-Depends=$$dep" >> $(d_min).substvars + sed -i '/^shlibs:Depends/s/libc6[^,]*[, ]*//' $(d_min).substvars + dh_gencontrol -a + dh_md5sums -a + dh_builddeb -a + +# rules to patch the unpacked files in the source directory +# --------------------------------------------------------------------------- +# various rules to unpack addons and (un)apply patches. +# - patch / apply-patches +# - unpatch / reverse-patches + +patchdir = debian/patches + +old_sphinx := $(shell dpkg --compare-versions $$(dpkg -l python-sphinx | awk '/^ii *python-sphinx/ {print $$3}') lt 1 && echo yes || echo no) + +$(patchdir)/series: $(patchdir)/series.in + cpp -E \ + -D$(distribution) \ + $(if $(filter $(old_sphinx),yes),-DOLD_SPHINX) \ + -Darch_os_$(DEB_HOST_ARCH_OS) -Darch_$(DEB_HOST_ARCH) \ + -o - $(patchdir)/series.in \ + | egrep -v '^(#.*|$$)' > $(patchdir)/series + +patch-stamp: stamps/stamp-patch +patch: stamps/stamp-patch +stamps/stamp-patch: $(patchdir)/series + dh_testdir + uname -a + @echo USER=$$USER, LOGNAME=$$LOGNAME + QUILT_PATCHES=$(patchdir) quilt push -a || test $$? = 2 + rm -rf autom4te.cache configure + autoconf + mkdir -p stamps + echo ""; echo "Patches applied in this version:" > stamps/pxx + for i in $$(cat $(patchdir)/series); do \ + echo ""; echo "$$i:"; \ + sed -n 's/^# *DP: */ /p' $(patchdir)/$$i; \ + done >> stamps/pxx + + touch Parser/acceler.c Parser/grammar1.c Parser/listnode.c \ + Parser/node.c Parser/parser.c Parser/bitset.c Parser/metagrammar.c \ + Parser/firstsets.c Parser/grammar.c Parser/pgen.c + touch Objects/obmalloc.c Python/dynamic_annotations.c \ + Python/mysnprintf.c Python/pyctype.c Parser/tokenizer_pgen.c \ + Parser/printgrammar.c Parser/parsetok_pgen.c Parser/pgenmain.c + @sleep 1 + touch Grammar/Grammar + @sleep 1 + touch Include/graminit.h + @sleep 1 + touch Python/graminit.c + + ln -sf site-packages Lib/dist-packages + + mv stamps/pxx $@ + +reverse-patches: unpatch +unpatch: + QUILT_PATCHES=$(patchdir) quilt pop -a -R || test $$? = 2 + rm -f stamps/stamp-patch $(patchdir)/series + rm -rf configure autom4te.cache + +update-patches: $(patchdir)/series + export QUILT_PATCHES=$(patchdir); \ + export QUILT_REFRESH_ARGS="--no-timestamps --no-index -pab"; \ + export QUILT_DIFF_ARGS="--no-timestamps --no-index -pab"; \ + while quilt push; do quilt refresh; done + +binary: binary-indep binary-arch + +.PHONY: control-file configure build clean binary-indep binary-arch binary install + +# Local Variables: +# mode: makefile +# end: --- python3.4-3.4.1.orig/debian/script.py +++ python3.4-3.4.1/debian/script.py @@ -0,0 +1,61 @@ +#! /usr/bin/python3 + +# Copyright (C) 2012 Colin Watson . 
+# +# Permission is hereby granted, free of charge, to any person obtaining +# a copy of this software and associated documentation files (the +# "Software"), to deal in the Software without restriction, including +# without limitation the rights to use, copy, modify, merge, publish, +# distribute, sublicense, and/or sell copies of the Software, and to +# permit persons to whom the Software is furnished to do so, subject +# to the following conditions: +# +# The above copyright notice and this permission notice shall be included +# in all copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + +"""Trivial script(1) workalike, but without reading from standard input.""" + +import os +import pty +import select +import sys + +filename = sys.argv[1] +command = sys.argv[2] + +pid, master = pty.fork() +if pid == 0: # child + os.execlp("sh", "sh", "-c", command) + +# parent +with open(filename, "wb") as logfile: + try: + while True: + rfds, _, _ = select.select([master], [], []) + if master in rfds: + data = os.read(master, 65536) + os.write(1, data) + logfile.write(data) + logfile.flush() + except (IOError, OSError): + pass + +pid, status = os.wait() +returncode = 0 +if os.WIFSIGNALED(status): + returncode = -os.WTERMSIG(status) +elif os.WIFEXITED(status): + returncode = os.WEXITSTATUS(status) +else: + # Should never happen + raise RuntimeError("Unknown child exit status!") +os.close(master) +sys.exit(returncode) --- python3.4-3.4.1.orig/debian/sitecustomize.py.in +++ python3.4-3.4.1/debian/sitecustomize.py.in @@ -0,0 +1,7 @@ +# install the apport exception handler if available +try: + import apport_python_hook +except ImportError: + pass +else: + apport_python_hook.install() --- python3.4-3.4.1.orig/debian/source.lintian-overrides +++ python3.4-3.4.1/debian/source.lintian-overrides @@ -0,0 +1,2 @@ +# generated during the build +python3.4 source: quilt-build-dep-but-no-series-file --- python3.4-3.4.1.orig/debian/source.lintian-overrides.in +++ python3.4-3.4.1/debian/source.lintian-overrides.in @@ -0,0 +1,2 @@ +# generated during the build +@PVER@ source: quilt-build-dep-but-no-series-file --- python3.4-3.4.1.orig/debian/source/format +++ python3.4-3.4.1/debian/source/format @@ -0,0 +1 @@ +1.0 --- python3.4-3.4.1.orig/debian/tests/control +++ python3.4-3.4.1/debian/tests/control @@ -0,0 +1,19 @@ +Tests: testsuite +Depends: build-essential, locales, python3.4-dev, libpython3.4-testsuite, python3-gdbm +# need to turn off apport +Restrictions: needs-root + +Tests: testsuite-dbg +Depends: build-essential, locales, python3.4-dev, python3.4-dbg, libpython3.4-testsuite, python3-gdbm-dbg, gdb +# need to turn off apport +Restrictions: needs-root + +Tests: failing-tests +Depends: build-essential, python3.4-dev, libpython3.4-testsuite, python3-gdbm +# need to turn off apport +Restrictions: needs-root allow-stderr + +Tests: failing-tests-dbg +Depends: build-essential, python3.4-dev, python3.4-dbg, libpython3.4-testsuite, python3-gdbm-dbg, gdb +# need to turn off apport +Restrictions: needs-root allow-stderr --- 
python3.4-3.4.1.orig/debian/tests/failing-tests +++ python3.4-3.4.1/debian/tests/failing-tests @@ -0,0 +1,94 @@ +#!/bin/sh + +set -e + +if [ "$(whoami)" = root ]; then + if [ -n "$SUDO_USER" ] && getent passwd "$SUDO_USER" > /dev/null; then + su_user="$SUDO_USER" + else + su_user=nobody + fi + + if [ -e /etc/default/apport ]; then + # stop apport + stop apport 2>/dev/null || true + sed -i '/^enabled=/s/=.*/=0/' /etc/default/apport 2>/dev/null + + if status apport | grep -q start; then + echo >&2 "apport is running. needs to be disabled before running the tests" + exit 1 + fi + fi +fi + +tmphome=$ADTTMP/home +mkdir -p $tmphome +if [ -n "$su_user" ]; then + chmod go+rx $ADTTMP + chown $su_user:nogroup $tmphome +fi +ls -la $ADTTMP + +# no root access needed after this point + +debian_dir=$(dirname $(dirname $0)) + +export LOCPATH=$(pwd)/locales +sh $debian_dir/locale-gen + +export LANG=C + +TESTPYTHON="python3.4 -W default -bb -E -R -m test" +TESTOPTS="-j 1 -w -uall,-network,-urlfetch,-gui" +TESTEXCLUSIONS= + +# test_dbm: Fails from time to time ... +TESTEXCLUSIONS="$TESTEXCLUSIONS test_dbm" + +# test_ensurepip: not yet installed, http://bugs.debian.org/732703 +# ... and then test_venv fails too +TESTEXCLUSIONS="$TESTEXCLUSIONS test_ensurepip test_venv " + +# test_zipfile: Issue 17753, requires write access to test and email.test +TESTEXCLUSIONS="$TESTEXCLUSIONS test_zipfile" + +if [ "$su_user" = nobody ]; then + log=/dev/null +else + log=testsuite.log +fi + +script=$debian_dir/script.py +echo "Running the failing tests with the standard interpreter:" +progressions= +for tst in $TESTEXCLUSIONS; do + if [ -f "$script" ]; then + cmd="HOME=$tmphome python3.4 $script \"$log\" \"$TESTPYTHON $TESTOPTS $tst\"" + else + cmd="HOME=$tmphome $TESTPYTHON $TESTOPTS $tst" + fi + + echo "Running $tst ..." + if [ "$(whoami)" = root ]; then + echo "su -s /bin/sh -c $cmd $su_user" + if su -s /bin/sh -c "$cmd" $su_user; then + progressions="$progressions $tst" + else + : + fi + else + echo "$cmd" + if eval $cmd; then + progressions="$progressions $tst" + else + : + fi + fi +done + +if [ -n "$progressions" ]; then + echo "Tests run: $TESTEXCLUSIONS" + echo "Progressions:$progressions" +fi + +exit 0 --- python3.4-3.4.1.orig/debian/tests/failing-tests-dbg +++ python3.4-3.4.1/debian/tests/failing-tests-dbg @@ -0,0 +1,94 @@ +#!/bin/sh + +set -e + +if [ "$(whoami)" = root ]; then + if [ -n "$SUDO_USER" ] && getent passwd "$SUDO_USER" > /dev/null; then + su_user="$SUDO_USER" + else + su_user=nobody + fi + + if [ -e /etc/default/apport ]; then + # stop apport + stop apport 2>/dev/null || true + sed -i '/^enabled=/s/=.*/=0/' /etc/default/apport 2>/dev/null + + if status apport | grep -q start; then + echo >&2 "apport is running. needs to be disabled before running the tests" + exit 1 + fi + fi +fi + +tmphome=$ADTTMP/home +mkdir -p $tmphome +if [ -n "$su_user" ]; then + chmod go+rx $ADTTMP + chown $su_user:nogroup $tmphome +fi +ls -la $ADTTMP + +# no root access needed after this point + +debian_dir=$(dirname $(dirname $0)) + +export LOCPATH=$(pwd)/locales +sh $debian_dir/locale-gen + +export LANG=C + +TESTPYTHON="python3.4dm -W default -bb -E -R -m test" +TESTOPTS="-j 1 -w -uall,-network,-urlfetch,-gui" +TESTEXCLUSIONS= + +# test_dbm: Fails from time to time ... +TESTEXCLUSIONS="$TESTEXCLUSIONS test_dbm" + +# test_ensurepip: not yet installed, http://bugs.debian.org/732703 +# ... 
and then test_venv fails too +TESTEXCLUSIONS="$TESTEXCLUSIONS test_ensurepip test_venv " + +# test_zipfile: Issue 17753, requires write access to test and email.test +TESTEXCLUSIONS="$TESTEXCLUSIONS test_zipfile" + +if [ "$su_user" = nobody ]; then + log=/dev/null +else + log=testsuite-dbg.log +fi + +script=$debian_dir/script.py +echo "Running the failing tests with the debug enabled interpreter:" +progressions= +for tst in $TESTEXCLUSIONS; do + if [ -f "$script" ]; then + cmd="HOME=$tmphome python3.4 $script \"$log\" \"$TESTPYTHON $TESTOPTS $tst\"" + else + cmd="HOME=$tmphome $TESTPYTHON $TESTOPTS $tst" + fi + + echo "Running $tst ..." + if [ "$(whoami)" = root ]; then + echo "su -s /bin/sh -c $cmd $su_user" + if su -s /bin/sh -c "$cmd" $su_user; then + progressions="$progressions $tst" + else + : + fi + else + echo "$cmd" + if eval $cmd; then + progressions="$progressions $tst" + else + : + fi + fi +done + +if [ -n "$progressions" ]; then + echo "Tests run: $TESTEXCLUSIONS" + echo "Progressions:$progressions" +fi + +exit 0 --- python3.4-3.4.1.orig/debian/tests/testsuite +++ python3.4-3.4.1/debian/tests/testsuite @@ -0,0 +1,83 @@ +#!/bin/sh + +set -e + +if [ "$(whoami)" = root ]; then + if [ -n "$SUDO_USER" ] && getent passwd "$SUDO_USER" > /dev/null; then + su_user="$SUDO_USER" + else + su_user=nobody + fi + + if [ -e /etc/default/apport ]; then + # stop apport + stop apport 2>/dev/null || true + sed -i '/^enabled=/s/=.*/=0/' /etc/default/apport 2>/dev/null + + if status apport | grep -q start; then + echo >&2 "apport is running. needs to be disabled before running the tests" + exit 1 + fi + fi +fi +tmphome=$ADTTMP/home +mkdir -p $tmphome +if [ -n "$su_user" ]; then + chmod go+rx $ADTTMP + chown $su_user:nogroup $tmphome +fi +ls -la $ADTTMP + +tmphome=$ADTTMP/home +mkdir -p $tmphome +if [ -n "$su_user" ]; then + chown $su_user $tmphome +fi + +# no root access needed after this point + +debian_dir=$(dirname $(dirname $0)) + +export LOCPATH=$(pwd)/locales +sh $debian_dir/locale-gen + +export LANG=C + +TESTPYTHON="python3.4 -W default -bb -E -R -m test" +TESTOPTS="-j 1 -w -uall,-network,-urlfetch,-gui" +TESTEXCLUSIONS="-x" + +# test_dbm: Fails from time to time ... +TESTEXCLUSIONS="$TESTEXCLUSIONS test_dbm" + +# test_ensurepip: not yet installed, http://bugs.debian.org/732703 +# ... 
and then test_venv fails too +TESTEXCLUSIONS="$TESTEXCLUSIONS test_ensurepip test_venv " + +# test_gdb: not run for the optimized build +TESTEXCLUSIONS="$TESTEXCLUSIONS test_gdb" + +# test_zipfile: Issue 17753, requires write access to test and email.test +TESTEXCLUSIONS="$TESTEXCLUSIONS test_zipfile" + +if [ "$su_user" = nobody ]; then + log=/dev/null +else + log=testsuite.log +fi + +script=$debian_dir/script.py +if [ -f "$script" ]; then + cmd="HOME=$tmphome python3.4 $script \"$log\" \"$TESTPYTHON $TESTOPTS $TESTEXCLUSIONS\"" +else + cmd="HOME=$tmphome $TESTPYTHON $TESTOPTS $TESTEXCLUSIONS" +fi + +echo "Running the python testsuite with the standard interpreter:" +if [ "$(whoami)" = root ]; then + echo "su -s /bin/sh -c $cmd $su_user" + su -s /bin/sh -c "$cmd" $su_user +else + echo "$cmd" + eval $cmd +fi --- python3.4-3.4.1.orig/debian/tests/testsuite-dbg +++ python3.4-3.4.1/debian/tests/testsuite-dbg @@ -0,0 +1,75 @@ +#!/bin/sh + +set -e + +if [ "$(whoami)" = root ]; then + if [ -n "$SUDO_USER" ] && getent passwd "$SUDO_USER" > /dev/null; then + su_user="$SUDO_USER" + else + su_user=nobody + fi + + if [ -e /etc/default/apport ]; then + # stop apport + stop apport 2>/dev/null || true + sed -i '/^enabled=/s/=.*/=0/' /etc/default/apport 2>/dev/null + + if status apport | grep -q start; then + echo >&2 "apport is running. needs to be disabled before running the tests" + exit 1 + fi + fi +fi + +tmphome=$ADTTMP/home +mkdir -p $tmphome +if [ -n "$su_user" ]; then + chmod go+rx $ADTTMP + chown $su_user:nogroup $tmphome +fi +ls -la $ADTTMP + +# no root access needed after this point + +debian_dir=$(dirname $(dirname $0)) + +export LOCPATH=$(pwd)/locales +sh $debian_dir/locale-gen + +export LANG=C + +TESTPYTHON="python3.4dm -W default -bb -E -R -m test" +TESTOPTS="-j 1 -w -uall,-network,-urlfetch,-gui" +TESTEXCLUSIONS="-x" + +# test_dbm: Fails from time to time ... +TESTEXCLUSIONS="$TESTEXCLUSIONS test_dbm" + +# test_ensurepip: not yet installed, http://bugs.debian.org/732703 +# ... and then test_venv fails too +TESTEXCLUSIONS="$TESTEXCLUSIONS test_ensurepip test_venv " + +# test_zipfile: Issue 17753, requires write access to test and email.test +TESTEXCLUSIONS="$TESTEXCLUSIONS test_zipfile" + +if [ "$su_user" = nobody ]; then + log=/dev/null +else + log=testsuite-dbg.log +fi + +script=$debian_dir/script.py +if [ -f "$script" ]; then + cmd="HOME=$tmphome python3.4 $script \"$log\" \"$TESTPYTHON $TESTOPTS $TESTEXCLUSIONS\"" +else + cmd="HOME=$tmphome $TESTPYTHON $TESTOPTS $TESTEXCLUSIONS" +fi + +echo "Running the python testsuite with the debug enabled interpreter:" +if [ "$(whoami)" = root ]; then + echo "su -s /bin/sh -c $cmd $su_user" + su -s /bin/sh -c "$cmd" $su_user +else + echo "$cmd" + eval $cmd +fi --- python3.4-3.4.1.orig/debian/watch +++ python3.4-3.4.1/debian/watch @@ -0,0 +1,3 @@ +version=3 +opts=dversionmangle=s/.*\+//,uversionmangle=s/([abcr]+[1-9])$/~$1/ \ + http://www.python.org/ftp/python/3\.4(\.\d)?/Python-(3\.4[.\dabcr]*)\.tgz
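The uversionmangle rule in the watch file above rewrites upstream pre-release suffixes (a, b, c, or rc followed by a digit) into "~"-prefixed forms, so that dpkg's version comparison sorts them before the corresponding final release. A minimal Python sketch of the same substitution, for illustration only (it is not part of the packaging and only mirrors the sed-style rule in debian/watch):

#!/usr/bin/python3
# Illustrative sketch: mimic the uversionmangle rule s/([abcr]+[1-9])$/~$1/
# from debian/watch. The "~" makes pre-releases sort before the final
# release in dpkg version ordering (e.g. 3.4.1~rc1 < 3.4.1).
import re

def mangle(upstream_version):
    # same substitution as in debian/watch, applied to the upstream version
    return re.sub(r'([abcr]+[1-9])$', r'~\1', upstream_version)

assert mangle("3.4.1") == "3.4.1"         # final release: unchanged
assert mangle("3.4.1rc1") == "3.4.1~rc1"  # release candidate sorts earlier
assert mangle("3.4.0b2") == "3.4.0~b2"    # beta pre-release
print("uversionmangle examples OK")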