How do I determine the size of an object in memory in Python?
Just use the sys.getsizeof function defined in the sys module.
sys.getsizeof(object[, default]): Return the size of an object in bytes. The object can be any type of object. All built-in objects will return correct results, but this does not have to hold true for third-party extensions as it is implementation specific. Only the memory consumption directly attributed to the object is accounted for, not the memory consumption of objects it refers to. The default argument allows to define a value which will be returned if the object type does not provide means to retrieve the size and would cause a TypeError. getsizeof calls the object’s __sizeof__ method and adds an additional garbage collector overhead if the object is managed by the garbage collector. See recursive sizeof recipe for an example of using getsizeof() recursively to find the size of containers and all their contents.
Usage example, in Python 3.0:
>>> import sys
>>> x = 2
>>> sys.getsizeof(x)
24
>>> sys.getsizeof(sys.getsizeof)
32
>>> sys.getsizeof('this')
38
>>> sys.getsizeof('this also')
48
If you are in Python < 2.6 and don't have sys.getsizeof, you can use this extension module instead. Never used it, though.
This can be more complicated than it looks, depending on how you want to count things. For instance, if you have a list of ints, do you want the size of the list containing the references to those ints (i.e. the list only, not what is contained in it)? Or do you want to include the actual data pointed to, in which case you need to deal with duplicate references, and how to prevent double-counting when two objects contain references to the same object.
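A minimal sketch of that distinction, assuming 64-bit CPython (exact numbers vary by build):

import sys

nums = [10 ** 100] * 3           # three references to one large int
print(sys.getsizeof(nums))       # size of the list object itself (its array of references)
print(sys.getsizeof(nums[0]))    # the large int is not included in the number above
# Naively summing the elements would also count the shared int three times,
# so a "deep" size needs to track the ids it has already seen.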
You may want to check out one of the Python memory profilers, such as pysizer, to see if they meet your needs.
For numpy arrays, getsizeof doesn't work - for me it always returns 40 for some reason:
from pylab import *
from sys import getsizeof
A = rand(10)
B = rand(10000)
Then (in ipython):
In [64]: getsizeof(A)
Out[64]: 40
In [65]: getsizeof(B)
Out[65]: 40
Happily, though:
In [66]: A.nbytes
Out[66]: 80
In [67]: B.nbytes
Out[67]: 80000
Here is a quick script I wrote, based on the previous answers, to list the sizes of all variables:

import sys

for i in dir():
    print(i, sys.getsizeof(eval(i)))
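A hedged variant of the same idea (not from the original answer) avoids eval by walking globals() directly and sorting the names by size:

import sys

# Hypothetical variant: inspect globals() instead of dir()/eval, largest first.
for name, value in sorted(globals().items(),
                          key=lambda item: sys.getsizeof(item[1]),
                          reverse=True):
    print(name, sys.getsizeof(value))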
How do I determine the size of an object in Python?
The answer, "Just use sys.getsizeof", is not a complete answer.
That answer does apply to builtin objects directly, but it does not account for what those objects may contain, specifically, what types they may contain, such as custom objects, tuples, lists, dicts, and sets. They can contain instances of each other, as well as numbers, strings and other objects.
A more complete answer
Using 64-bit Python 3.6 from the Anaconda distribution, with sys.getsizeof, I have determined the minimum size of the following objects, and note that sets and dicts preallocate space, so empty ones don't grow again until after a set amount (which may vary by implementation of the language):
Python 3:
Bytes  type        empty + scaling notes
28     int         +4 bytes about every 30 powers of 2
37     bytes       +1 byte per additional byte
49     str         +1-4 per additional character (depending on max width)
48     tuple       +8 per additional item
64     list        +8 for each additional
224    set         5th increases to 736; 21st, 2272; 85th, 8416; 341st, 32992
240    dict        6th increases to 368; 22nd, 1184; 43rd, 2280; 86th, 4704; 171st, 9320
136    func def    does not include default args and other attrs
1056   class def   no slots
56     class inst  has a __dict__ attr, same scaling as dict above
888    class def   with slots
16     __slots__   seems to store in mutable tuple-like structure
                   first slot grows to 48, and so on.
How do you interpret this? Say you have a set with 10 items in it. If each item is 100 bytes each, how big is the whole data structure? The set itself is 736 because it has sized up once to 736 bytes. Then you add the size of the items, so that's 1736 bytes in total.
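You can watch those preallocation jumps yourself; a quick sketch (the thresholds and byte counts depend on the Python build):

import sys

s = set()
last = sys.getsizeof(s)
print(0, last)
for i in range(25):
    s.add(i)
    size = sys.getsizeof(s)
    if size != last:             # print only when the set resizes
        print(len(s), size)
        last = size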
Some caveats for function and class definitions:
Note that each class definition has a proxy __dict__ (48 bytes) structure for class attrs. Each slot has a descriptor (like a property) in the class definition.
Slotted instances start out at 48 bytes with their first element, and increase by 8 bytes for each additional one. Only empty slotted objects have 16 bytes, and an instance with no data makes very little sense.
Also, each function definition has code objects, maybe docstrings, and other possible attributes, even a __dict__.
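A quick sketch of the instance sizes discussed above (class names here are made up for illustration; byte counts depend on the build):

import sys

class Plain:
    pass

class Slotted:
    __slots__ = ('a', 'b')

p, s = Plain(), Slotted()
print(sys.getsizeof(p))           # the instance itself; its __dict__ is a separate object
print(sys.getsizeof(p.__dict__))  # attribute storage, scales like a dict
print(sys.getsizeof(s))           # slotted instance, no __dict__ at all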
Also note that we use sys.getsizeof() because we care about the marginal space usage, which includes the garbage collection overhead for the object. From the docs:
getsizeof() calls the object's __sizeof__ method and adds an additional garbage collector overhead if the object is managed by the garbage collector.
Also note that resizing lists (e.g. repetitively appending to them) causes them to preallocate space, similarly to sets and dicts. From the listobj.c source code:
/* This over-allocates proportional to the list size, making room
* for additional growth. The over-allocation is mild, but is
* enough to give linear-time amortized behavior over a long
* sequence of appends() in the presence of a poorly-performing
* system realloc().
* The growth pattern is: 0, 4, 8, 16, 25, 35, 46, 58, 72, 88, ...
* Note: new_allocated won't overflow because the largest possible value
* is PY_SSIZE_T_MAX * (9 / 8) + 6 which always fits in a size_t.
*/
new_allocated = (size_t)newsize + (newsize >> 3) + (newsize < 9 ? 3 : 6);
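The effect is visible from Python as well; a rough sketch (the exact growth pattern differs between versions):

import sys

lst = []
last = sys.getsizeof(lst)
print(len(lst), last)
for i in range(20):
    lst.append(i)
    size = sys.getsizeof(lst)
    if size != last:             # each jump marks an over-allocation
        print(len(lst), size)
        last = size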
Historical data
Python 2.7 analysis, confirmed with guppy.hpy and sys.getsizeof:
Bytes type empty + scaling notes
24 int NA
28 long NA
37 str + 1 byte per additional character
52 unicode + 4 bytes per additional character
56 tuple + 8 bytes per additional item
72 list + 32 for first, 8 for each additional
232 set sixth item increases to 744; 22nd, 2280; 86th, 8424
280 dict sixth item increases to 1048; 22nd, 3352; 86th, 12568 *
120 func def does not include default args and other attrs
64 class inst has a __dict__ attr, same scaling as dict above
16 __slots__ class with slots has no dict, seems to store in
mutable tuple-like structure.
904 class def has a proxy __dict__ structure for class attrs
104 old class makes sense, less stuff, has real dict though.
Note that dictionaries (but not sets) got a more compact representation in Python 3.6.
I think 8 bytes per additional item to reference makes a lot of sense on a 64-bit machine. Those 8 bytes point to the place in memory the contained item is at. The 4 bytes are fixed width for unicode in Python 2, if I recall correctly, but in Python 3, str becomes a unicode of width equal to the max width of the characters.
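A quick check of the 8-bytes-per-reference claim (64-bit CPython; the base size of the empty tuple varies by version):

import sys

for n in range(5):
    t = tuple(range(n))
    print(n, sys.getsizeof(t))   # grows by 8 bytes for each additional reference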
And for more on slots, see this answer.
A more complete function
We want a function that searches the elements in lists, tuples, sets, dicts, obj.__dict__'s, and obj.__slots__, as well as other things we may not have yet thought of.
We want to rely on gc.get_referents to do this search because it works at the C level (making it very fast). The downside is that get_referents can return redundant members, so we need to ensure we don't double count.
Classes, modules, and functions are singletons - they exist once in memory. We're not so interested in their size, as there's not much we can do about them - they're a part of the program. So we'll avoid counting them if they happen to be referenced.
We're going to use a blacklist of types so we don't include the entire program in our size count.
import sys
from types import ModuleType, FunctionType
from gc import get_referents
# Custom objects know their class.
# Function objects seem to know way too much, including modules.
# Exclude modules as well.
BLACKLIST = type, ModuleType, FunctionType
def getsize(obj):
    """sum size of object & members."""
    if isinstance(obj, BLACKLIST):
        raise TypeError('getsize() does not take argument of type: ' + str(type(obj)))
    seen_ids = set()
    size = 0
    objects = [obj]
    while objects:
        need_referents = []
        for obj in objects:
            if not isinstance(obj, BLACKLIST) and id(obj) not in seen_ids:
                seen_ids.add(id(obj))
                size += sys.getsizeof(obj)
                need_referents.append(obj)
        objects = get_referents(*need_referents)
    return size
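A hedged usage example for the function above (the byte counts you get depend on the Python version and platform):

data = {'numbers': list(range(10)), 'text': 'hello' * 100}
print(getsize(data))        # the container plus everything gc can reach from it
print(sys.getsizeof(data))  # for comparison: the outer dict object alone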
In contrast to the whitelisted function below, most objects know how to traverse themselves for garbage collection purposes (which is approximately what we're looking for when we want to know how expensive in memory certain objects are; this functionality is used by gc.get_referents). However, this measure is going to be much more expansive in scope than we intended if we are not careful.
For example, functions know quite a lot about the modules they are created in.
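To see why functions are blacklisted, a quick sketch:

import gc

def example():      # a throwaway function, just for this demonstration
    pass

# Among the referents is the function's globals dict (effectively the whole
# module namespace), so following it would pull in far more than intended.
print(gc.get_referents(example))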
Another point of contrast is that strings that are keys in dictionaries are usually interned, so they are not duplicated. Checking for id(key) will also allow us to avoid counting duplicates, which we do in the next section. The blacklist solution skips counting keys that are strings altogether.
Whitelisted types, recursive visitor
To cover most of these types myself, instead of relying on the gc module, I wrote this recursive function to try to estimate the size of most Python objects, including most builtins, types in the collections module, and custom types (slotted and otherwise).
This sort of function gives much more fine-grained control over the types we're going to count for memory usage, but has the danger of leaving important types out:
import sys
from numbers import Number
from collections import deque
from collections.abc import Set, Mapping
ZERO_DEPTH_BASES = (str, bytes, Number, range, bytearray)
def getsize(obj_0):
    """Recursively iterate to sum size of object & members."""
    _seen_ids = set()

    def inner(obj):
        obj_id = id(obj)
        if obj_id in _seen_ids:
            return 0
        _seen_ids.add(obj_id)
        size = sys.getsizeof(obj)
        if isinstance(obj, ZERO_DEPTH_BASES):
            pass  # bypass remaining control flow and return
        elif isinstance(obj, (tuple, list, Set, deque)):
            size += sum(inner(i) for i in obj)
        elif isinstance(obj, Mapping) or hasattr(obj, 'items'):
            size += sum(inner(k) + inner(v) for k, v in getattr(obj, 'items')())
        # Check for custom object instances - may subclass above too
        if hasattr(obj, '__dict__'):
            size += inner(vars(obj))
        if hasattr(obj, '__slots__'):  # can have __slots__ with __dict__
            size += sum(inner(getattr(obj, s)) for s in obj.__slots__ if hasattr(obj, s))
        return size

    return inner(obj_0)
And I tested it rather casually (I should unittest it):
>>> getsize(['a', tuple('bcd'), Foo()])
344
>>> getsize(Foo())
16
>>> getsize(tuple('bcd'))
194
>>> getsize(['a', tuple('bcd'), Foo(), {'foo': 'bar', 'baz': 'bar'}])
752
>>> getsize({'foo': 'bar', 'baz': 'bar'})
400
>>> getsize({})
280
>>> getsize({'foo':'bar'})
360
>>> getsize('foo')
40
>>> class Bar():
...     def baz():
...         pass
>>> getsize(Bar())
352
>>> getsize(Bar().__dict__)
280
>>> sys.getsizeof(Bar())
72
>>> getsize(Bar.__dict__)
872
>>> sys.getsizeof(Bar.__dict__)
280
This implementation breaks down on class definitions and function definitions because we don't go after all of their attributes, but since they should only exist once in memory for the process, their size really doesn't matter too much.
The Pympler package's asizeof module can do this.
Use it as follows:
from pympler import asizeof
asizeof.asizeof(my_object)
Unlike sys.getsizeof, it works for objects you have created yourself. It even works with numpy.
>>> asizeof.asizeof(tuple('bcd'))
200
>>> asizeof.asizeof({'foo': 'bar', 'baz': 'bar'})
400
>>> asizeof.asizeof({})
280
>>> asizeof.asizeof({'foo':'bar'})
360
>>> asizeof.asizeof('foo')
40
>>> asizeof.asizeof(Bar())
352
>>> asizeof.asizeof(Bar().__dict__)
280
>>> A = rand(10)
>>> B = rand(10000)
>>> asizeof.asizeof(A)
176
>>> asizeof.asizeof(B)
80096
As mentioned,
The (byte)code size of objects like classes, functions, methods, modules, etc. can be included by setting option code=True.
And if you need another view on live data, Pympler's
module muppy is used for on-line monitoring of a Python application and module Class Tracker provides off-line analysis of the lifetime of selected Python objects.
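For example, a sketch of the code option (assuming Pympler is installed; greet is a throwaway function for illustration, and numbers vary by version):

from pympler import asizeof

def greet(name):
    return 'hello ' + name

print(asizeof.asizeof(greet))             # by default, (byte)code size is excluded
print(asizeof.asizeof(greet, code=True))  # include the compiled code object as well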
Having run into this problem many times myself, I wrote up a small function (inspired by @aaron-hall's answer) & tests that does what I would have expected sys.getsizeof to do:
https://github.com/bosswissam/pysize
If you're interested in the backstory, here it is.
EDIT: Attaching the code below for easy reference. To see the most up-to-date code, please check the github link.
import sys
def get_size(obj, seen=None):
    """Recursively finds size of objects"""
    size = sys.getsizeof(obj)
    if seen is None:
        seen = set()
    obj_id = id(obj)
    if obj_id in seen:
        return 0
    # Important: mark as seen *before* entering recursion to gracefully handle
    # self-referential objects
    seen.add(obj_id)
    if isinstance(obj, dict):
        size += sum([get_size(v, seen) for v in obj.values()])
        size += sum([get_size(k, seen) for k in obj.keys()])
    elif hasattr(obj, '__dict__'):
        size += get_size(obj.__dict__, seen)
    elif hasattr(obj, '__iter__') and not isinstance(obj, (str, bytes, bytearray)):
        size += sum([get_size(i, seen) for i in obj])
    return size
Python 3.8 (Q1 2019) will change some of the results of sys.getsizeof, as announced here by Raymond Hettinger:
Python containers are 8 bytes smaller on 64-bit builds.
tuple ()   48 -> 40
list  []   64 -> 56
set   ()  224 -> 216
dict  {}  240 -> 232
This comes after issue 33597 and Inada Naoki (methane)'s work around Compact PyGC_Head, and PR 7043.
This idea reduces PyGC_Head size to two words. Currently, PyGC_Head takes three words: gc_prev, gc_next, and gc_refcnt. gc_refcnt is used when collecting, for trial deletion. gc_prev is used for tracking and untracking. So if we can avoid tracking/untracking while trial deletion, gc_prev and gc_refcnt can share the same memory space.
See commit d5c875b:
Removed one Py_ssize_t member from PyGC_Head. All GC-tracked objects (e.g. tuple, list, dict) are reduced in size by 4 or 8 bytes.
If you don't need the exact size of the object but roughly to know how big it is, one quick (and dirty) way is to let the program run, sleep for an extended period of time, and check the memory usage (ex: Mac's activity monitor) by this particular python process. This would be effective when you are trying to find the size of one single large object in a python process. For example, I recently wanted to check the memory usage of a new data structure and compare it with that of Python's set data structure. First I wrote the elements (words from a large public domain book) to a set, then checked the size of the process, and then did the same thing with the other data structure. I found out the Python process with a set is taking twice as much memory as the new data structure. Again, you wouldn't be able to exactly say the memory used by the process is equal to the size of the object. As the size of the object gets large, this becomes close as the memory consumed by the rest of the process becomes negligible compared to the size of the object you are trying to monitor.
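A rough sketch of that approach using psutil instead of an activity monitor (the delta also includes allocator overhead, so treat it as an estimate only):

import os
import psutil

process = psutil.Process(os.getpid())

before = process.memory_info().rss            # resident set size before building
data = set(str(i) for i in range(1_000_000))  # the large object under test
after = process.memory_info().rss

print(f'approximate size: {after - before} bytes')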
You can make use of getsizeof(), as shown below, to determine the size of an object:
import sys
str1 = "one"
int_element=5
print("Memory size of '"+str1+"' = "+str(sys.getsizeof(str1))+ " bytes")
print("Memory size of '"+ str(int_element)+"' = "+str(sys.getsizeof(int_element))+ " bytes")
Use sys.getsizeof() if you DON'T want to include sizes of linked (nested) objects.
However, if you want to count sub-objects nested in lists, dicts, sets, tuples - and usually this is what you're looking for - use the recursive deep sizeof() function as shown below:
import sys
def sizeof(obj):
    size = sys.getsizeof(obj)
    if isinstance(obj, dict): return size + sum(map(sizeof, obj.keys())) + sum(map(sizeof, obj.values()))
    if isinstance(obj, (list, tuple, set, frozenset)): return size + sum(map(sizeof, obj))
    return size
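For example (the numbers depend on the interpreter):

nested = {'words': ['foo', 'bar'], 'coords': (1.0, 2.0)}
print(sys.getsizeof(nested))  # just the outer dict
print(sizeof(nested))         # dict + keys + list + strings + tuple + floats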
You will also find this function in the nifty toolbox, together with many other useful one-liners:
https://github.com/mwojnars/nifty/blob/master/util.py
You can serialize the object to derive a measure that is closely related to the size of the object:
import pickle
## let o be the object whose size you want to measure
size_estimate = len(pickle.dumps(o))
If you want to measure objects that cannot be pickled (e.g. because of lambda expressions), dill or cloudpickle can be a solution.
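For instance, a sketch with dill (assuming the dill package is installed):

import dill

unpicklable = lambda x: x + 1                 # plain pickle rejects lambdas
size_estimate = len(dill.dumps(unpicklable))
print(size_estimate)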
I use this trick... It may not be accurate on small objects, but I think it's much more accurate for complex objects (like a pygame surface) than sys.getsizeof().
import pygame as pg
import os
import psutil
import time
process = psutil.Process(os.getpid())
pg.init()
vocab = ['hello', 'me', 'you', 'she', 'he', 'they', 'we',
         'should', 'why?', 'necessarily', 'do', 'that']
font = pg.font.SysFont("monospace", 100, True)
dct = {}
newMem = process.memory_info().rss  # don't mind this line
Str = f'store ' + f'Nothing \tsurface use about '.expandtabs(15) + \
      f'0\t bytes'.expandtabs(9)  # don't mind this assignment too
usedMem = process.memory_info().rss
for word in vocab:
    dct[word] = font.render(word, True, pg.Color("#000000"))
    time.sleep(0.1)  # wait a moment
    # get total used memory of this script:
    newMem = process.memory_info().rss
    Str = f'store ' + f'{word}\tsurface use about '.expandtabs(15) + \
          f'{newMem - usedMem}\t bytes'.expandtabs(9)
    print(Str)
    usedMem = newMem
On my Windows 10, Python 3.7.3, the output is:
store hello surface use about 225280 bytes
store me surface use about 61440 bytes
store you surface use about 94208 bytes
store she surface use about 81920 bytes
store he surface use about 53248 bytes
store they surface use about 114688 bytes
store we surface use about 57344 bytes
store should surface use about 172032 bytes
store why? surface use about 110592 bytes
store necessarily surface use about 311296 bytes
store do surface use about 57344 bytes
store that surface use about 110592 bytes
Use the following function to get the actual size of a Python object:
import sys
import gc
def actualsize(input_obj):
    memory_size = 0
    ids = set()
    objects = [input_obj]
    while objects:
        new = []
        for obj in objects:
            if id(obj) not in ids:
                ids.add(id(obj))
                memory_size += sys.getsizeof(obj)
                new.append(obj)
        objects = gc.get_referents(*new)
    return memory_size

actualsize([1, 2, [3, 4, 5, 1]])
Reference: https://towardsdatascience.com/the-strange-size-of-python-objects-in-memory-ce87bdfbb97f