
Resolve race condition

Sometimes, messages were being written out at the same time, meaning
that the status wasn't being overwritten in place: the doing and done
messages were each displayed on a separate line.

Rather than trying to write out both sets of statuses concurrently,
we write out all of the doing messages first. Then the done messages
are written out/updated as each command completes.
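The ordering described above can be sketched as follows. This is a minimal, illustrative sketch, not Compose's actual API: `run_all` is a hypothetical helper taking a mapping of names to zero-argument callables, and it appends to an output list rather than writing to a terminal.

```python
import concurrent.futures


def run_all(tasks, doing_msg, done_msg, out):
    # tasks: mapping of name -> zero-arg callable (illustrative only).
    # Write every "doing" line up front, before any worker starts,
    # so no worker races with this loop for the output.
    for name in tasks:
        out.append("{}: {}...".format(name, doing_msg))

    # Only the "done" updates happen while workers run concurrently.
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor:
        futures = {executor.submit(fn): name for name, fn in tasks.items()}
        for future in concurrent.futures.as_completed(futures):
            future.result()  # re-raise any worker exception
            out.append("{}: {}".format(futures[future], done_msg))
```

Because all doing lines are emitted before the executor starts, their order is deterministic; only the done lines arrive in completion order.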

Signed-off-by: Mazz Mosley <[email protected]>
Mazz Mosley, 10 years ago
Parent commit: 61787fecea
1 changed file with 9 additions and 1 deletion

compose/utils.py (+9 −1)

@@ -21,8 +21,10 @@ def parallel_execute(command, containers, doing_msg, done_msg, **options):
     stream = codecs.getwriter('utf-8')(sys.stdout)
     lines = []
 
-    def container_command_execute(container, command, **options):
+    for container in containers:
         write_out_msg(stream, lines, container.name, doing_msg)
+
+    def container_command_execute(container, command, **options):
         return getattr(container, command)(**options)
 
     with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as executor:
@@ -41,6 +43,10 @@ def parallel_execute(command, containers, doing_msg, done_msg, **options):
 
 
 def write_out_msg(stream, lines, container_name, msg):
+    """
+    Using special ANSI code characters we can write out the msg over the top of
+    a previous status message, if it exists.
+    """
     if container_name in lines:
         position = lines.index(container_name)
         diff = len(lines) - position
@@ -56,6 +62,8 @@ def write_out_msg(stream, lines, container_name, msg):
         lines.append(container_name)
         stream.write("{}: {}... \r\n".format(container_name, msg))
 
+    stream.flush()
+
 
 def json_hash(obj):
     dump = json.dumps(obj, sort_keys=True, separators=(',', ':'))