worker: Use buffered input from the pickle stream

On Python 3, "pickle.load" will raise an exception
("_pickle.UnpicklingError: pickle data was truncated") when it gets a short
read, i.e. it receives fewer bytes than it requested.

On our build machine, Mercurial seems to frequently hit this problem while
updating a mozilla-central clone iff it gets scheduled in batch mode. It is
easy to trigger with:

  # wipe the workdir
  rm -rf *
  hg update null
  chrt -b 0 hg update default

I've also written the following program, which demonstrates the core
problem:

  from __future__ import print_function
  import io
  import os
  import pickle
  import time

  obj = {"a": 1, "b": 2}
  obj_data = pickle.dumps(obj)
  assert len(obj_data) > 10

  rfd, wfd = os.pipe()

  pid = os.fork()
  if pid == 0:
      os.close(rfd)
      for _ in range(4):
          time.sleep(0.5)
          print("First write")
          os.write(wfd, obj_data[:10])
          time.sleep(0.5)
          print("Second write")
          os.write(wfd, obj_data[10:])
      os._exit(0)

  try:
      os.close(wfd)
      rfile = os.fdopen(rfd, "rb", 0)
      print("Reading")
      while True:
          try:
              obj_copy = pickle.load(rfile)
              assert obj == obj_copy
          except EOFError:
              break
      print("Success")
  finally:
      os.kill(pid, 15)

The program reliably fails with Python 3.8 and succeeds with Python 2.7.
Providing the unpickler with a buffered reader fixes the issue, so let
"os.fdopen" create one.

https://bugzilla.mozilla.org/show_bug.cgi?id=1604486

Differential Revision: https://phab.mercurial-scm.org/D8051
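For context, a minimal sketch of the shape of the fix, reusing the pipe setup
from the demo program above (an illustration only, not the literal worker.py
change): dropping the explicit buffer size of 0 lets "os.fdopen" return an
io.BufferedReader, which keeps reading the raw fd until it has the number of
bytes pickle asked for (or hits EOF), so short reads no longer surface as
truncated pickle data.

  import os
  import pickle

  rfd, wfd = os.pipe()
  os.write(wfd, pickle.dumps({"a": 1}))
  os.close(wfd)

  # Buffered: without the third argument of 0, os.fdopen wraps the raw fd
  # in an io.BufferedReader, which retries short reads internally.
  rfile = os.fdopen(rfd, "rb")
  # Explicit equivalent: io.BufferedReader(io.FileIO(rfd, "rb"))
  assert pickle.load(rfile) == {"a": 1}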

indexapi.py
# Infinite push
#
# Copyright 2016 Facebook, Inc.
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
from __future__ import absolute_import


class indexapi(object):
    """Class that manages access to infinitepush index.

    This class is a context manager and all write operations (like
    deletebookmarks, addbookmark etc) should use `with` statement:

      with index:
          index.deletebookmarks(...)
          ...
    """

    def __init__(self):
        """Initializes the metadata store connection."""

    def close(self):
        """Cleans up the metadata store connection."""

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        pass

    def addbundle(self, bundleid, nodesctx):
        """Takes a bundleid and a list of node contexts for each node
        in that bundle and records that."""
        raise NotImplementedError()

    def addbookmark(self, bookmark, node):
        """Takes a bookmark name and hash, and records mapping in the metadata
        store."""
        raise NotImplementedError()

    def addmanybookmarks(self, bookmarks):
        """Takes a dict with mapping from bookmark to hash and records mapping
        in the metadata store."""
        raise NotImplementedError()

    def deletebookmarks(self, patterns):
        """Accepts list of bookmarks and deletes them."""
        raise NotImplementedError()

    def getbundle(self, node):
        """Returns the bundleid for the bundle that contains the given node."""
        raise NotImplementedError()

    def getnode(self, bookmark):
        """Returns the node for the given bookmark. None if it doesn't exist."""
        raise NotImplementedError()

    def getbookmarks(self, query):
        """Returns bookmarks that match the query"""
        raise NotImplementedError()

    def saveoptionaljsonmetadata(self, node, jsonmetadata):
        """Saves optional metadata for a given node"""
        raise NotImplementedError()


class indexexception(Exception):
    pass
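
As an illustration of how a concrete backend plugs into the interface above,
here is a hypothetical dict-backed implementation (the name memindexapi, the
exact-name handling of delete patterns, the prefix matching in getbookmarks,
and the use of ctx.hex() on the node contexts are assumptions made for this
sketch, not part of infinitepush, which ships its own SQL- and file-based
stores):

class memindexapi(indexapi):
    """Illustrative in-memory index; real backends persist their data."""

    def __init__(self):
        super(memindexapi, self).__init__()
        self._bundles = {}    # hex node -> bundleid
        self._bookmarks = {}  # bookmark -> hex node
        self._metadata = {}   # hex node -> json metadata

    def addbundle(self, bundleid, nodesctx):
        # Assumes each entry exposes hex(), as Mercurial change contexts do.
        for ctx in nodesctx:
            self._bundles[ctx.hex()] = bundleid

    def addbookmark(self, bookmark, node):
        self._bookmarks[bookmark] = node

    def addmanybookmarks(self, bookmarks):
        self._bookmarks.update(bookmarks)

    def deletebookmarks(self, patterns):
        # Simplification: treats patterns as exact bookmark names.
        for pattern in patterns:
            self._bookmarks.pop(pattern, None)

    def getbundle(self, node):
        return self._bundles.get(node)

    def getnode(self, bookmark):
        return self._bookmarks.get(bookmark)

    def getbookmarks(self, query):
        # Simplification: treats the query as a bookmark-name prefix.
        return {
            bookmark: node
            for bookmark, node in self._bookmarks.items()
            if bookmark.startswith(query)
        }

    def saveoptionaljsonmetadata(self, node, jsonmetadata):
        self._metadata[node] = jsonmetadata


# Write operations go through the context manager, as the class docstring asks:
index = memindexapi()
with index:
    index.addbookmark(b'infinitepush/example', b'0' * 40)
assert index.getnode(b'infinitepush/example') == b'0' * 40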