
Commit b5ceb5f

Changing Key noun to be Object in storage package.
This involves changes in all documentation, renaming a module and a test module, and updating the calls and variable names in a regression test module. Fixes #544.
1 parent 1bfa469 commit b5ceb5f

21 files changed: +824 -818 lines
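
In short, the commit swaps the ``Key`` vocabulary for ``Object`` across the storage package's public surface. A minimal before/after sketch, using only calls that appear in the diffs below (the bucket and project IDs are placeholders):

    import gcloud.storage

    bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-id')

    # Before this commit (Key vocabulary):
    key = bucket.get_key('/remote/path/to/file.txt')
    print key.get_contents_as_string()

    # After this commit (Object vocabulary):
    object_ = bucket.get_object('/remote/path/to/file.txt')
    print object_.get_contents_as_string()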

README.rst

Lines changed: 3 additions & 3 deletions
@@ -95,9 +95,9 @@ to Cloud Storage using this Client Library.
 import gcloud.storage
 bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-id')
 # Then do other things...
-key = bucket.get_key('/remote/path/to/file.txt')
-print key.get_contents_as_string()
-key.set_contents_from_string('New contents!')
+object_ = bucket.get_object('/remote/path/to/file.txt')
+print object_.get_contents_as_string()
+object_.set_contents_from_string('New contents!')
 bucket.upload_file('/remote/path/storage.txt', '/local/path.txt')

 Contributing

docs/_components/storage-getting-started.rst

Lines changed: 31 additions & 36 deletions
@@ -4,7 +4,7 @@ Getting started with Cloud Storage
 This tutorial focuses on using ``gcloud`` to access
 Google Cloud Storage.
 We'll go through the basic concepts,
-how to operate on buckets and keys,
+how to operate on buckets and objects,
 and how to handle access control,
 among other things.

@@ -113,33 +113,28 @@ by recognizing forward-slashes (``/``)
 so if you want to group data into "directories",
 you can do that.

-The fundamental container for a file in Cloud Storage
-is called an Object,
-however ``gcloud`` uses the term ``Key``
-to avoid confusion between ``object`` and ``Object``.
-
 If you want to set some data,
-you just create a ``Key`` inside your bucket
-and store your data inside the key::
+you just create a ``Object`` inside your bucket
+and store your data inside it::

->>> key = bucket.new_key('greeting.txt')
->>> key.set_contents_from_string('Hello world!')
+>>> object_ = bucket.new_object('greeting.txt')
+>>> object_.set_contents_from_string('Hello world!')

-:func:`new_key <gcloud.storage.bucket.Bucket.new_key>`
-creates a :class:`Key <gcloud.storage.key.Key>` object locally
+:func:`new_object <gcloud.storage.bucket.Bucket.new_object>`
+creates a :class:`Object <gcloud.storage.object_.Object>` object locally
 and
-:func:`set_contents_from_string <gcloud.storage.key.Key.set_contents_from_string>`
-allows you to put a string into the key.
+:func:`set_contents_from_string <gcloud.storage.object_.Object.set_contents_from_string>`
+allows you to put a string into the object.

 Now we can test if it worked::

->>> key = bucket.get_key('greeting.txt')
->>> print key.get_contents_as_string()
+>>> object_ = bucket.get_object('greeting.txt')
+>>> print object_.get_contents_as_string()
 Hello world!

 What if you want to save the contents to a file?

->>> key.get_contents_to_filename('greetings.txt')
+>>> object_.get_contents_to_filename('greetings.txt')

 Then you can look at the file in a terminal::

@@ -149,32 +144,32 @@ Then you can look at the file in a terminal::
 And what about when you're not dealing with text?
 That's pretty simple too::

->>> key = bucket.new_key('kitten.jpg')
->>> key.set_contents_from_filename('kitten.jpg')
+>>> object_ = bucket.new_object('kitten.jpg')
+>>> object_.set_contents_from_filename('kitten.jpg')

 And to test whether it worked?

->>> key = bucket.get_key('kitten.jpg')
->>> key.get_contents_to_filename('kitten2.jpg')
+>>> object_ = bucket.get_object('kitten.jpg')
+>>> object_.get_contents_to_filename('kitten2.jpg')

 and check if they are the same in a terminal::

 $ diff kitten.jpg kitten2.jpg

 Notice that we're using
-:func:`get_key <gcloud.storage.bucket.Bucket.get_key>`
-to retrieve a key we know exists remotely.
-If the key doesn't exist, it will return ``None``.
+:func:`get_object <gcloud.storage.bucket.Bucket.get_object>`
+to retrieve an object we know exists remotely.
+If the object doesn't exist, it will return ``None``.

-.. note:: ``get_key`` is **not** retrieving the entire object's data.
+.. note:: ``get_object`` is **not** retrieving the entire object's data.

-If you want to "get-or-create" the key
+If you want to "get-or-create" the object
 (that is, overwrite it if it already exists),
-you can use :func:`new_key <gcloud.storage.bucket.Bucket.new_key>`.
-However, keep in mind, the key is not created
+you can use :func:`new_object <gcloud.storage.bucket.Bucket.new_object>`.
+However, keep in mind, the object is not created
 until you store some data inside of it.

-If you want to check whether a key exists,
+If you want to check whether an object exists,
 you can use the ``in`` operator in Python::

 >>> print 'kitten.jpg' in bucket

@@ -191,17 +186,17 @@ to retrieve the bucket object::

 >>> bucket = connection.get_bucket('my-bucket')

-If you want to get all the keys in the bucket,
+If you want to get all the objects in the bucket,
 you can use
-:func:`get_all_keys <gcloud.storage.bucket.Bucket.get_all_keys>`::
+:func:`get_all_objects <gcloud.storage.bucket.Bucket.get_all_objects>`::

->>> keys = bucket.get_all_keys()
+>>> objects = bucket.get_all_objects()

-However, if you're looking to iterate through the keys,
+However, if you're looking to iterate through the objects,
 you can use the bucket itself as an iterator::

->>> for key in bucket:
-... print key
+>>> for object_ in bucket:
+... print object_

 Deleting a bucket
 -----------------

@@ -234,7 +229,7 @@ Managing access control
 -----------------------

 Cloud storage provides fine-grained access control
-for both buckets and keys.
+for both buckets and objects.
 `gcloud` tries to simplify access control
 by working with entities and "grants".
 On any ACL,
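
Taken together, the renamed tutorial walks through a workflow like the following. This is a sketch, not part of the commit: it assumes a bucket obtained as in the tutorial (with hypothetical bucket/project IDs) and uses only the methods named in the hunks above:

    import gcloud.storage

    # Hypothetical bucket and project IDs, as in the tutorial's setup.
    bucket = gcloud.storage.get_bucket('my-bucket', 'project-id')

    # Create an object locally and store data in it.
    object_ = bucket.new_object('greeting.txt')
    object_.set_contents_from_string('Hello world!')

    # get_object returns None if the object does not exist remotely.
    fetched = bucket.get_object('greeting.txt')
    if fetched is not None:
        print fetched.get_contents_as_string()
        fetched.get_contents_to_filename('greetings.txt')

    # Check membership, list all objects, or iterate over the bucket.
    print 'greeting.txt' in bucket
    objects = bucket.get_all_objects()
    for object_ in bucket:
        print object_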

docs/_components/storage-quickstart.rst

Lines changed: 9 additions & 9 deletions
@@ -53,22 +53,22 @@ and instantiating the demo connection::
 >>> connection = demo.get_connection()

 Once you have the connection,
-you can create buckets and keys::
+you can create buckets and objects::

 >>> connection.get_all_buckets()
 [<Bucket: ...>, ...]
 >>> bucket = connection.create_bucket('my-new-bucket')
 >>> print bucket
 <Bucket: my-new-bucket>
->>> key = bucket.new_key('my-test-file.txt')
->>> print key
-<Key: my-new-bucket, my-test-file.txt>
->>> key = key.set_contents_from_string('this is test content!')
->>> print key.get_contents_as_string()
+>>> object_ = bucket.new_object('my-test-file.txt')
+>>> print object_
+<Object: my-new-bucket, my-test-file.txt>
+>>> object_ = object_.set_contents_from_string('this is test content!')
+>>> print object_.get_contents_as_string()
 'this is test content!'
->>> print bucket.get_all_keys()
-[<Key: my-new-bucket, my-test-file.txt>]
->>> key.delete()
+>>> print bucket.get_all_objects()
+[<Object: my-new-bucket, my-test-file.txt>]
+>>> object_.delete()
 >>> bucket.delete()

 .. note::

docs/index.rst

Lines changed: 3 additions & 3 deletions
@@ -11,7 +11,7 @@
 datastore-batches
 storage-api
 storage-buckets
-storage-keys
+storage-objects
 storage-acl


@@ -48,5 +48,5 @@ Cloud Storage

 from gcloud import storage
 bucket = storage.get_bucket('<your-bucket-name>', '<your-project-id>')
-key = bucket.new_key('my-test-file.txt')
-key = key.upload_contents_from_string('this is test content!')
+object_ = bucket.new_object('my-test-file.txt')
+object_ = object_.upload_contents_from_string('this is test content!')

docs/storage-api.rst

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 .. toctree::
   :maxdepth: 0
-  :hidden:
+  :hidden:

 Storage
 -------

docs/storage-keys.rst

Lines changed: 0 additions & 7 deletions
This file was deleted.

docs/storage-objects.rst

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+Objects
+~~~~~~~
+
+.. automodule:: gcloud.storage.object_
+  :members:
+  :undoc-members:
+  :show-inheritance:

gcloud/storage/__init__.py

Lines changed: 4 additions & 4 deletions
@@ -19,9 +19,9 @@
 >>> import gcloud.storage
 >>> bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-id')
 >>> # Then do other things...
->>> key = bucket.get_key('/remote/path/to/file.txt')
->>> print key.get_contents_as_string()
->>> key.set_contents_from_string('New contents!')
+>>> object_ = bucket.get_object('/remote/path/to/file.txt')
+>>> print object_.get_contents_as_string()
+>>> object_.set_contents_from_string('New contents!')
 >>> bucket.upload_file('/remote/path/storage.txt', '/local/path.txt')

 The main concepts with this API are:

@@ -32,7 +32,7 @@
 - :class:`gcloud.storage.bucket.Bucket` which represents a particular
   bucket (akin to a mounted disk on a computer).

-- :class:`gcloud.storage.key.Key` which represents a pointer to a
+- :class:`gcloud.storage.object_.Object` which represents a pointer to a
   particular entity in Cloud Storage (akin to a file path on a remote
   machine).
 """

gcloud/storage/_helpers.py

Lines changed: 4 additions & 4 deletions
@@ -79,11 +79,11 @@ def batch(self):
 ... bucket.enable_versioning()
 ... bucket.disable_website()

-or for a key::
+or for an object::

->>> with key.batch:
-... key.content_type = 'image/jpeg'
-... key.content_encoding = 'gzip'
+>>> with object_.batch:
+... object_.content_type = 'image/jpeg'
+... object_.content_encoding = 'gzip'

 Updates will be aggregated and sent as a single call to
 :meth:`_patch_properties` IFF the ``with`` block exits without
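
The batching behaviour documented here is unchanged by the rename; a short sketch, assuming ``object_`` was fetched as in the tutorial above:

    # Assumes object_ = bucket.get_object('kitten.jpg').
    # Property assignments inside the block are aggregated and sent as a
    # single PATCH (via _patch_properties) when the block exits cleanly.
    with object_.batch:
        object_.content_type = 'image/jpeg'
        object_.content_encoding = 'gzip'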

gcloud/storage/acl.py

Lines changed: 16 additions & 14 deletions
@@ -491,15 +491,15 @@ class DefaultObjectACL(BucketACL):


 class ObjectACL(ACL):
-    """An ACL specifically for a key."""
+    """An ACL specifically for a Cloud Storage Object.

-    def __init__(self, key):
-        """
-        :type key: :class:`gcloud.storage.key.Key`
-        :param key: The key that this ACL corresponds to.
-        """
+    :type object_: :class:`gcloud.storage.object_.Object`
+    :param object_: The object that this ACL corresponds to.
+    """
+
+    def __init__(self, object_):
         super(ObjectACL, self).__init__()
-        self.key = key
+        self.object_ = object_

     def reload(self):
         """Reload the ACL data from Cloud Storage.

@@ -509,16 +509,17 @@ def reload(self):
         """
         self.entities.clear()

-        url_path = '%s/acl' % self.key.path
-        found = self.key.connection.api_request(method='GET', path=url_path)
+        url_path = '%s/acl' % self.object_.path
+        found = self.object_.connection.api_request(method='GET',
+                                                    path=url_path)
         self.loaded = True
         for entry in found['items']:
             self.add_entity(self.entity_from_dict(entry))

         return self

     def save(self, acl=None):
-        """Save the ACL data for this key.
+        """Save the ACL data for this object.

         :type acl: :class:`gcloud.storage.acl.ACL`
         :param acl: The ACL object to save. If left blank, this will

@@ -531,8 +532,9 @@ def save(self, acl=None):
             save_to_backend = True

         if save_to_backend:
-            result = self.key.connection.api_request(
-                method='PATCH', path=self.key.path, data={'acl': list(acl)},
+            result = self.object_.connection.api_request(
+                method='PATCH', path=self.object_.path,
+                data={'acl': list(acl)},
                 query_params={'projection': 'full'})
             self.entities.clear()
             for entry in result['acl']:

@@ -542,11 +544,11 @@
         return self

     def clear(self):
-        """Remove all ACL rules from the key.
+        """Remove all ACL rules from the object.

         Note that this won't actually remove *ALL* the rules, but it
         will remove all the non-default rules. In short, you'll still
-        have access to a key that you created even after you clear ACL
+        have access to an object that you created even after you clear ACL
         rules with this method.
         """
         return self.save([])
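
``ObjectACL`` keeps the same reload/save/clear flow under the new parameter name; a brief usage sketch, assuming ``object_`` is a ``gcloud.storage.object_.Object`` fetched elsewhere (e.g. via ``bucket.get_object``):

    from gcloud.storage.acl import ObjectACL

    acl = ObjectACL(object_)
    acl.reload()   # GET <object path>/acl and rebuild the local entities
    acl.save()     # PATCH the object with the current ACL entries
    acl.clear()    # drop the non-default rules by saving an empty ACL list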
