torch.Storage

A torch.Storage is a contiguous, one-dimensional array of a single data type. Every torch.Tensor has a corresponding storage of the same data type.
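A minimal sketch of this relationship: a tensor and its storage view the same underlying data, so a write through one is visible through the other (on recent PyTorch versions, storage() returns a TypedStorage and may emit a deprecation warning, but the behavior is the same).

```python
import torch

t = torch.FloatTensor([1.0, 2.0, 3.0])
s = t.storage()          # the tensor's underlying FloatStorage

s[0] = 10.0              # writing through the storage...
print(t[0].item())       # ...is visible through the tensor: 10.0
```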
class torch.FloatStorage

- byte(): Casts this storage to byte type.
- char(): Casts this storage to char type.
- clone(): Returns a copy of this storage.
- copy_(): Copies the elements of another storage into this storage.
- cpu(): Returns a CPU copy of this storage if it's not already on the CPU.
- cuda(device=None, async=False): Returns a copy of this object in CUDA memory. If this object is already in CUDA memory and on the correct device, then no copy is performed and the original object is returned.
  Parameters:
  - device (int): The destination GPU id. Defaults to the current device.
  - async (bool): If True and the source is in pinned memory, the copy will be asynchronous with respect to the host. Otherwise the argument has no effect.
- data_ptr(): Returns a pointer to the first element of this storage.
- double(): Casts this storage to double type.
- element_size(): Returns the size in bytes of an individual element.
- fill_(): Fills this storage with the specified value and returns this storage.
- float(): Casts this storage to float type.
- from_buffer(): Creates a storage from a Python object that implements the buffer protocol.
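A brief sketch of from_buffer, shown here on ByteStorage, where each input byte maps directly to one element (wider types additionally take a byte-order argument):

```python
import torch

# Build a storage from a bytes-like object.
s = torch.ByteStorage.from_buffer(b"abc")
print(s.tolist())   # [97, 98, 99]
```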
- half(): Casts this storage to half type.
- int(): Casts this storage to int type.
- is_cuda = False
- is_pinned(): Returns True if this storage resides in pinned memory.
- is_sparse = False
- long(): Casts this storage to long type.
- new(): Returns a new, empty storage of the same type.
- pin_memory(): Copies the storage to pinned memory, if it's not already pinned.
- resize_(): Resizes this storage in place to hold the given number of elements.
- share_memory_(): Moves the storage to shared memory. This is a no-op for storages already in shared memory and for CUDA storages, which do not need to be moved for sharing across processes. Storages in shared memory cannot be resized.
  Returns: self
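A small sketch of sharing a CPU storage across processes; the call returns the storage itself, so it can be chained:

```python
import torch

s = torch.FloatStorage(3)
s.fill_(0.0)
r = s.share_memory_()
print(r is s)           # True: share_memory_ returns self
print(s.is_shared())    # True
```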
- short(): Casts this storage to short type.
- size(): Returns the number of elements in this storage.
- tolist(): Returns a list containing the elements of this storage.
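The element-wise accessors above in one sketch: a FloatStorage of four elements, filled in place and read back as a plain Python list.

```python
import torch

s = torch.FloatStorage(4)
s.fill_(2.5)
print(s.size())           # 4 elements
print(s.element_size())   # 4 bytes per float32 element
print(s.tolist())         # [2.5, 2.5, 2.5, 2.5]
```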
- type(new_type=None, async=False): Casts this object to the specified type. If this is already of the correct type, no copy is performed and the original object is returned.
  Parameters:
  - new_type (type or string): The desired type.
  - async (bool): If True and the source is in pinned memory and the destination is on the GPU (or vice versa), the copy is performed asynchronously with respect to the host. Otherwise this argument has no effect.
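The per-type cast methods above (byte(), int(), double(), and so on) are convenience shorthands for casting; a sketch of their behavior, noting that float-to-int conversion truncates toward zero:

```python
import torch

s = torch.FloatStorage([1.5, 2.5])
print(s.int().tolist())      # [1, 2]: fractional parts dropped
print(s.double().tolist())   # [1.5, 2.5]: values preserved, wider type
```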