A Developer's Guide to Python 3.0: Core Language Changes : Page 2

In this deep comparison between Python 2.x and Python 3.0, discover the far-reaching changes to the Python core language, type system, and the standard library, how they'll affect your code, and guidelines for migration.




Explore the New Type System

In Python 3.0, old-style classes are gone. Python 2.2 introduced new-style classes that unified built-in types and user-defined types, but old-style classes were kept for backward compatibility. In Python 2.x (where x >= 2) you define classes like this:

class OldStyleClass:
    pass

class NewStyleClass(object):
    pass

In Python 3.0 you don't need to inherit from object anymore. The old-style classes were cramping Python's style, and many new features couldn't be applied to old-style classes. You'll see more about decorators, function annotations, new metaclasses, and abstract base classes in the following sections.
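A quick interpreter check (a sketch with a hypothetical class name) confirms that a plain Python 3 class is new-style:

```python
# In Python 3, every class is new-style: object is an implicit base.
class Plain:
    pass

# The new-style machinery (MRO, descriptors, metaclasses) is always available.
assert issubclass(Plain, object)
assert Plain.__mro__[-1] is object
print(type(Plain))   # <class 'type'>
```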

Abstract Base Classes

The Abstract Base Classes (ABCs) feature is part of an ongoing trend to make Python's object model richer, and provide metadata for classes. Abstract base classes let you test the capabilities of objects (such as function call parameters) reliably. Here's a quick example:

def foo(arg):
    assert isinstance(arg, dict)
    print(arg['name'])

The foo() function needs to access its arg argument as a dictionary, so it checks whether arg is an instance of dict—but that's limiting in many situations. Any class that defines a __getitem__ method may be accessed as a dictionary:

class A(object):
    def __getitem__(self, key):
        return len(key) * 'A'

>>> a = A()
>>> print(a['123'])
AAA

It is possible to test for the existence of every attribute you plan to use, but that's both tedious and error-prone, and it still might not guarantee that you're dealing with the right object.
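Without ABCs, the defensive version of foo() has to probe each capability by hand (a sketch; the helper name is hypothetical):

```python
def get_name(arg):
    # Probe every capability we rely on -- verbose, error-prone, and
    # still no guarantee that __getitem__ behaves like a dict's.
    if not (hasattr(arg, '__getitem__') and hasattr(arg, 'keys')):
        raise TypeError('expected a mapping-like object')
    return arg['name']

print(get_name({'name': 'devx'}))   # devx
```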

Abstract base classes solve this problem elegantly. You attach an abstract base class, which is a syntactic and (unenforced) semantic contract, to a class; later, you can test objects against that contract via isinstance() and/or issubclass(), as long as the object you are dealing with complies with the ABC.

Here's a simple example. Suppose you're writing a space shooter game with spaceships, planets, asteroids, missiles, lasers and whatnot. Of course, you need to detect collisions between the various objects, and to do that you need to access multiple attributes, such as an object's position and speed, and check whether the object is still "alive" after the collision. In the detect_collision() function you need to ensure that the objects being tested support all the necessary attributes and methods. Here's the function (the main logic is elided):

def detect_collision(obj_1, obj_2):
    assert isinstance(obj_1, MovingObject)
    assert isinstance(obj_2, MovingObject)
    # Collision detection logic follows...
    ...

The preceding function uses isinstance() to verify that both obj_1 and obj_2 are instances of MovingObject, which is an ABC that defines the methods and attributes that detect_collision() needs. Here's the definition for MovingObject:

from abc import ABCMeta, abstractmethod, abstractproperty

class MovingObject(metaclass=ABCMeta):
    @abstractmethod
    def is_alive(self):
        return self._lifeCount > 0

    @abstractmethod
    def move(self, x, y):
        pass

    @abstractproperty
    def speed(self):
        pass

    def get_position(self):
        return self._position

    def set_position(self, position):
        self._position = position

    position = abstractproperty(get_position, set_position)

The code starts by importing the new module abc, which contains the ABCMeta metaclass and the @abstractmethod and @abstractproperty decorators used to mark the abstract methods and properties of the ABC. Abstract methods and properties must be implemented by the class that implements the ABC contract even if the ABC provides an implementation; however, the implementing class may simply call the ABC's implementation. This is similar to C++ pure virtual functions, which may also have implementations (many people mistakenly believe that pure virtual functions in C++ can't have an implementation).
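The "abstract method with an implementation" pattern looks like this in miniature (hypothetical names, not from the game example):

```python
from abc import ABCMeta, abstractmethod

class Base(metaclass=ABCMeta):
    @abstractmethod
    def greet(self):
        # Abstract, yet it carries a usable default implementation
        return 'hello'

class Concrete(Base):
    def greet(self):
        # The override is mandatory, but it may defer to the ABC's body
        return Base.greet(self) + ', world'

print(Concrete().greet())   # hello, world
```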

The MovingObject class has ABCMeta as its metaclass (note the new syntax for metaclasses in Python 3.0, covered later). The ABCMeta metaclass is not strictly necessary for ABCs, but it makes life much easier because it provides default implementations of __instancecheck__() and __subclasscheck__(), which are the key methods for ABCs. Note that you can declare read-only properties such as speed using the @abstractproperty decorator, but you must declare read-write properties such as position using the abstractproperty class.

OK, you have a MovingObject ABC, but if you try to instantiate it, you'll get an error:

>>> MovingObject()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class MovingObject with abstract methods is_alive, move, position, speed

The error occurs because it's an abstract class. You can't instantiate a MovingObject instance, but MovingObject can serve as a base class. For example, the Spaceship class subclasses MovingObject and is thus obligated to implement all the abstract methods and properties, is_alive(), move(), speed, and position:

class Spaceship(MovingObject):
    def __init__(self):
        self._lifeCount = 1
        self._speed = 5
        self.position = (100, 100)

    def speed(self):
        return self._speed

    def get_life_count(self):
        return self._lifeCount

    def is_alive(self):
        return MovingObject.is_alive(self)

    def move(self, x, y):
        self.position = (self.position[0] + x, self.position[1] + y)

    position = property(MovingObject.get_position, MovingObject.set_position)

If your subclass doesn't implement all the abstract methods and properties it is still considered an abstract class, and therefore, if you try to instantiate it you will again get a nice error message with the names of the missing abstract methods or properties (is_alive and position in this case).

class StillAbstractMovingObject(MovingObject):
    def move(self):
        pass
    def speed(self):
        pass

>>> StillAbstractMovingObject()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class StillAbstractMovingObject with abstract methods is_alive, position

One potential problem is that implemented abstract methods don't need to have the same signature as the original method in the ABC; Python verifies only the method names. That means that the ABC framework will not ensure that your subclasses fully implement the abstract contract.
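To illustrate the point (a hypothetical Flyer ABC, not from the article's example): a subclass whose move() has a different signature still instantiates and passes isinstance() checks, because only the method name is matched:

```python
from abc import ABCMeta, abstractmethod

class Flyer(metaclass=ABCMeta):
    @abstractmethod
    def move(self, x, y):
        pass

class Bird(Flyer):
    # Wrong arity relative to the ABC -- Python doesn't care;
    # the abstract slot is considered implemented by name alone.
    def move(self):
        return 'flap'

b = Bird()                   # instantiates without complaint
print(isinstance(b, Flyer))  # True
print(b.move())              # flap -- the contract was never verified
```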

Python 3.0 comes preloaded with a bunch of ABCs for containers, iterators and numbers. You'll see more about the new number type hierarchy and its ABCs later. The collections module contains the ABC definitions for containers and iterators, while the numbers module defines the ABCs for numbers. Both modules conveniently define hierarchies of abstract classes that you can use to make sure an object supports the functionality you need. In collections, for example, you'll find a Mapping abstract class that all mapping (dictionary-like) objects should adhere to. Mapping subclasses the Sized, Iterable and Container abstract classes, which define the abstract methods __len__, __iter__ and __contains__. The Mapping class itself (see Listing 1) defines another abstract method called __getitem__, and uses it to implement the abstract __contains__ method. So, at the end of the day, Mapping has the following unimplemented abstract methods (stored in the __abstractmethods__ attribute): __getitem__, __iter__ and __len__. Concrete, instantiable mapping objects must implement all three methods.
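As a sketch of how little a concrete mapping needs to supply (note: in Python 3.3 and later these ABCs live in collections.abc rather than collections, and the class name here is hypothetical), implementing just the three abstract methods buys you the Mapping mixins such as get() and __contains__ for free:

```python
from collections.abc import Mapping   # just 'collections' in Python 3.0

class FrozenPair(Mapping):
    """A tiny read-only mapping holding exactly two keys."""
    def __init__(self, a, b):
        self._data = {'a': a, 'b': b}

    # The three abstract methods Mapping leaves unimplemented:
    def __getitem__(self, key):
        return self._data[key]

    def __iter__(self):
        return iter(self._data)

    def __len__(self):
        return len(self._data)

fp = FrozenPair(1, 2)
print(fp['a'])          # 1
print('b' in fp)        # True -- __contains__ comes from the mixin
print(fp.get('c', 0))   # 0    -- get() comes from the mixin too
```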

Classes don't always have to subclass an ABC to fulfill it and respond properly to isinstance() and issubclass() calls. You can also register classes and built-in types. For example, the built-in dict class subclasses only object, yet it still returns True for the following queries:

>>> issubclass(dict, collections.Mapping)
True
>>> issubclass(dict, collections.MutableMapping)
True
>>> isinstance({}, collections.Sized)
True

Registration is useful if you don't want to mess around with the base classes list of your classes. You should be careful when you register classes with ABCs, because registered classes always return True for issubclass() and isinstance() checks—even if the registered class does not actually implement the required abstract methods or properties. For example, the Spaceship class is registered as a Sequence even though it doesn't implement the required __len__ and __getitem__ abstract methods:

s = Spaceship()
assert isinstance(s, collections.Sequence) == False
collections.Sequence.register(Spaceship)
assert isinstance(s, collections.Sequence) == True
assert hasattr(s, '__len__') == False
assert hasattr(s, '__getitem__') == False

Author's Note: Read PEP-3119 for further details.

Class Decorators

Class decorators let you modify a class when it is first declared by passing the class to a function that may manipulate it by adding/removing/modifying methods, attributes, properties, base classes, etc. This is particularly useful when you need to apply some modification to a set of classes that aren't necessarily all derived from a common base class.

In the following example, the dump() function is a class decorator that takes its input class and adds a method called dump_methods() to it. The dump_methods() method scans the dictionary of its class and prints all the callable methods under a nice title. Note that dump() returns the cls input class. You'll use this class decorator on a couple of classes in a minute:

def dump(cls):
    def dump_methods(self):
        cls = type(self)
        s = cls.__name__ + ' methods'
        print(s + '\n' + '-' * len(s))
        for k, v in cls.__dict__.items():
            # The callable() builtin has been removed in Python 3.0
            if hasattr(v, '__call__'):
                print(k)
        print()
    # Attach the dump_methods nested function as a method to the input class
    cls.dump_methods = dump_methods
    return cls

Now that you have a class decorator, here's an example of decorating a couple of classes. The classes A and B are pretty boring, but they serve to demonstrate the class decorator in action:

@dump
class A:
    def foo(): pass
    def bar(): pass

@dump
class B:
    def baz(): pass

Both classes A and B are decorated by @dump, meaning you can call dump_methods() on their instances:

A().dump_methods()
B().dump_methods()

A methods
---------
bar
dump_methods
foo

B methods
---------
dump_methods
baz

As you can see, dump_methods() itself shows up in the output, because decorator-added methods are indistinguishable from the original methods.

Class decorators are a natural extension to function and method decorators. During the Python 2.4 design process (when function and method decorators were introduced into the language) they seemed redundant, because metaclasses provided a very similar capability. The main distinction (other than syntax) is that metaclasses are inherited: when you apply a metaclass to a class, all of that class's subclasses inherit the metaclass. In contrast, class decorators don't affect subclasses. Just to complicate things, you can compose multiple class decorators on a single class, but a class can have only one metaclass (things get complicated if you inherit from multiple base classes, each with its own metaclass, but in the end only one metaclass prevails). I find the mechanics of decorators much simpler to understand than metaclasses, and it's nice to have uniform syntax and semantics for class, function and method decorators.
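The inheritance difference is easy to demonstrate (a minimal sketch with hypothetical names): the decorator touches only the class it is applied to, while a metaclass runs again for every subclass:

```python
# A class decorator marks only the class it decorates...
def tag(cls):
    cls.tagged = True
    return cls

@tag
class Parent:
    pass

class Child(Parent):
    pass

assert Parent.tagged
assert 'tagged' not in Child.__dict__   # Child itself was never decorated

# ...whereas a metaclass is inherited by every subclass.
class Meta(type):
    registry = []
    def __init__(cls, name, bases, ns):
        Meta.registry.append(name)
        super().__init__(name, bases, ns)

class P(metaclass=Meta):
    pass

class C(P):   # Meta runs again for C -- no decorator needed
    pass

print(Meta.registry)   # ['P', 'C']
```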

For more information, read PEP-3129.
