Commit caf62fbe authored by Gary Poster

fix several small spelling errors.

parent 7aa0330d
@@ -206,7 +206,7 @@ set_operation(PyObject *s1, PyObject *s2,
The following ifdef works around a template/type problem
Weights are passed as integers. In particular, the weight passed by
-difference is one. This works find in the int value and float value
+difference is one. This works fine in the int value and float value
cases but makes no sense in the object value case. In the object
value case, we don't do merging, so we don't use the weights, so it
doesn't matter what they are.
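The weight handling described in that comment is also visible from the public BTrees API. The following is only a rough sketch (not part of this commit): integer-valued buckets support weighted merging, while object-valued trees offer only the unweighted set operations, which is why the weights are meaningless there.

    from BTrees.IIBTree import IIBucket, weightedUnion, difference

    a = IIBucket({1: 10, 2: 20})
    b = IIBucket({2: 5})

    # weightedUnion scales values by the integer weights before merging;
    # for key 2 the merged value is 20*1 + 5*2 == 30.
    w, merged = weightedUnion(a, b, 1, 2)

    # difference effectively uses a weight of one and keeps the first
    # argument's values, so the result here is {1: 10}.
    left_only = difference(a, b)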
@@ -787,8 +787,8 @@ class Connection(ExportImport, object):
return self._reader.getState(p)
def setstate(self, obj):
"""Turns the ghost 'obj' into a real object by loading it's from the
database."""
"""Turns the ghost 'obj' into a real object by loading its state from
the database."""
oid = obj._p_oid
if self._opened is None:
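As a rough illustration of what setstate() does for a reader of this diff (a hedged sketch, not part of the commit, assuming the in-memory MappingStorage): once an object has been turned back into a ghost, touching it reloads its state from the database, and _p_activate() triggers that path explicitly.

    import transaction
    from ZODB import DB
    from ZODB.MappingStorage import MappingStorage

    db = DB(MappingStorage())
    conn = db.open()
    root = conn.root()
    root['x'] = 1
    transaction.commit()

    conn.cacheMinimize()   # unmodified objects revert to ghosts
    root._p_activate()     # forces setstate(); attribute access would do the same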
@@ -822,7 +822,8 @@ class Connection(ExportImport, object):
# 3. Raise ConflictError.
# Does anything actually use _p_independent()? It would simplify
-# the code if we could drop support for it.
+# the code if we could drop support for it.
+# (BTrees.Length does.)
# There is a harmless data race with self._invalidated. A
# dict update could go on in another thread, but we don't care
@@ -530,7 +530,7 @@ class FileStorage(BaseStorage.BaseStorage,
return data, h.tid
else:
# Get the data from the backpointer, but tid from
-# currnt txn.
+# current txn.
data = self._loadBack_impl(oid, h.back)[0]
return data, h.tid
finally:
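The rule that comment states can be summarized with a deliberately simplified, hypothetical record structure (none of the names below reflect FileStorage's real on-disk layout): when a record carries only a back pointer, the data comes from the record it points back to, but the tid is always taken from the current record.

    def load_current(records, oid):
        # records[oid] is a hypothetical list of data records for one object
        rec = records[oid][-1]                    # most recent record
        data = rec['data']
        if data is None:                          # back pointer instead of data
            data = records[oid][rec['back']]['data']
        return data, rec['tid']                   # tid from the current record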
@@ -21,7 +21,7 @@ And create a persistent object in the first database:
>>> tm.commit()
First, we get a connection to the second database. We get the second
-connection using the first connection's `get_connextion` method. This
+connection using the first connection's `get_connection` method. This
is important. When using multiple databases, we need to make sure we
use a consistent set of connections so that the objects in the
connection caches are connected in a consistent manner.
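A minimal sketch of that pattern (not part of this file's elided setup; the storage and the database names used below are assumptions):

    >>> from ZODB import DB
    >>> from ZODB.MappingStorage import MappingStorage
    >>> databases = {}
    >>> db1 = DB(MappingStorage(), database_name='one', databases=databases)
    >>> db2 = DB(MappingStorage(), database_name='two', databases=databases)
    >>> conn1 = db1.open()
    >>> conn2 = conn1.get_connection('two')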
@@ -71,7 +71,7 @@ Databases for new objects
Objects are normally added to a database by making them reachable from
an object already in the database. This is unambiguous when there is
-only one database. With modultiple databases, it is not so clear what
+only one database. With multiple databases, it is not so clear what
happens. Consider:
>>> p4 = MyClass()
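The example is truncated here. Purely as a hedged illustration (not from the diff, reusing the conn2 name from the sketch above), the ambiguity can be avoided by adding the new object to a specific connection before linking it to anything:

    >>> conn2.add(p4)   # p4 will now be stored in conn2's database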
@@ -106,11 +106,11 @@ oid
'n'
Multi-database simple object reference. The arguments consist
-of a databaase name, and an object id.
+of a database name, and an object id.
'm'
Multi-database persistent object reference. The arguments consist
-of a databaase name, an object id, and class meta data.
+of a database name, an object id, and class meta data.
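For illustration only, and as an assumption about the concrete encoding rather than something shown in this excerpt, such references pair the reference type with its argument tuple in a list:

    ['n', (database_name, oid)]
    ['m', (database_name, oid, class_meta)]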
The following legacy format is also supported.
@@ -29,8 +29,8 @@ class MyClass_w_getnewargs(persistent.Persistent):
def test_must_use_consistent_connections():
"""
-It's important to use consistent connections. References to to
-separate connections to the ssme database or multi-database won't
+It's important to use consistent connections. References to
+separate connections to the same database or multi-database won't
work.
For example, it's tempting to open a second database using the
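The docstring is cut off here. As a hedged sketch of the contrast it sets up (database and connection names assumed from the earlier sketches): cross-database references in objects loaded through one connection are resolved through its companion connections, not through an unrelated connection opened directly on the second database.

    >>> conn_a = db1.open()
    >>> tempting = db2.open()                      # separate, unrelated connection
    >>> consistent = conn_a.get_connection('two')  # companion of conn_a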
@@ -85,7 +85,7 @@ unghostify(cPersistentObject *self)
if (self->state < 0 && self->jar) {
PyObject *r;
-/* Is it ever possibly to not have a cache? */
+/* Is it ever possible to not have a cache? */
if (self->cache) {
/* Create a node in the ring for this unghostified object. */
self->cache->non_ghost_count++;