
Allows an arbitrary number of commands to be strung together, with each one feeding into the next one's input. The syntax is simple: x = pipe("cmd1", "cmd2", "cmd3").read() is equivalent to the bash command x=$(cmd1 | cmd2 | cmd3).

Works under Python 2.4.2.

Python, 71 lines
#!/usr/bin/env python

import os
import subprocess
import sys
import StringIO

def cmdtolist(s):
    """Split a command string into an argument list, honouring single
    quotes, double quotes and backslash escapes."""
    cmdlist = []
    current = []
    gc = ''      # active quote character, '' when outside quotes
    last = ''    # previous character, used to detect backslash escapes
    for c in s:
        if (c == '\\'):
            pass                      # defer: the next character is taken literally
        elif (last == '\\'):
            current.append(c)         # escaped character
        elif (gc != ''):
            if (c != gc):
                current.append(c)     # inside quotes: keep everything, including spaces
            else:
                gc = ''               # closing quote
        else:
            if (c.isspace()):
                if (len(current) != 0):   # skip empty tokens from repeated whitespace
                    cmdlist.append(''.join(current))
                    current = []
            elif (c == '"'):
                gc = c
            elif (c == "'"):
                gc = c
            else:
                current.append(c)
        last = c
    if (len(current) != 0):
        cmdlist.append(''.join(current))
    return cmdlist

def pipe(*cmds):
    """Chain commands together. Each element of cmds is either a command
    string, a pre-split argument list, or a callable taking (fin, fout).
    Returns a file-like object holding the output of the last stage."""
    def norm(cmd):
        # accept either "ls -al" or ["ls", "-al"]
        if (isinstance(cmd, str)):
            return cmdtolist(cmd)
        return cmd
    def pipeobj(cmd, stdin=None):
        if (callable(cmd)):
            # python function stage: run it immediately against stdin
            fp = Fpipe(cmd, stdin)
            fp.call()
            fp.stdout.seek(0)
            return fp
        if (stdin is None):
            return subprocess.Popen(norm(cmd), stdout=subprocess.PIPE)
        else:
            return subprocess.Popen(norm(cmd), stdin=stdin, stdout=subprocess.PIPE)
    if (len(cmds) == 0):
        return
    prev = None
    for cmd in cmds:
        if (prev is None):
            prev = pipeobj(cmd)
        else:
            prev = pipeobj(cmd, stdin=prev.stdout)
    return prev.stdout

class Fpipe:
    """Wraps a python callable so it can sit in the pipeline; its output is
    collected in an in-memory StringIO buffer."""
    def __init__(self, fn, stdin=None):
        self.fn = fn
        self.stdin = stdin
        self.stdout = StringIO.StringIO()
    def call(self):
        self.fn(self.stdin, self.stdout)

I wrote this because I wanted a convenient way to write bash-like scripts in Python. Occasionally some external command (such as 'jar' in my case) needs to be invoked from within Python, and this makes it easier to run and process those commands. Try the following:

import pipe

for i in pipe.pipe("cat /etc/passwd", "grep /bin/bash"): print i

to print all lines containing '/bin/bash' from the file '/etc/passwd'. I went a little further than just allowing strings to be issued as commands, though. I thought: wouldn't it be great if you could slot in Python functions along the way to do processing in Python? Simple:

import pipe
import re

def grepper(fin, fout):
    for line in fin:
        if (re.search('/bin/bash', line)):
            fout.write(line)

for i in pipe.pipe("cat /etc/passwd", grepper): print i

To take it even further you could do the following:

import pipe
import re

def grepper(fin, fout):
    for line in fin:
        if (re.search('/bin/bash', line)):
            fout.write(line)

def catter(fin, fout):
    f = file('/etc/passwd')
    for line in f:
        fout.write(line)
    f.close()

for i in pipe.pipe(catter, grepper): print i

You can intermix Python functions with script utilities and applications that communicate via standard I/O.
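
For instance, building on the grepper example above (the extra upper stage is just an illustration of a second Python step, not part of the recipe):

import pipe
import re

def grepper(fin, fout):
    for line in fin:
        if (re.search('/bin/bash', line)):
            fout.write(line)

def upper(fin, fout):
    # a second python stage, purely for illustration
    for line in fin:
        fout.write(line.upper())

# an external command feeding two python functions in sequence
for i in pipe.pipe("cat /etc/passwd", grepper, upper): print i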

Pipe arguments are also flexible: you can pass pipe a list of already-separated arguments for each command:

dirlist=pipe(["ls", "-al"])

is equivalent to:

dirlist=pipe("ls -al")

Quote handling is also included in pipe, so that:

dirlist=pipe("ls -al 'a funny name'")

will give the three arguments ["ls", "-al", "a funny name"]. You can also use a backslash to escape a quote.
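
For example, assuming the recipe above is saved as pipe.py, the tokenizer can be exercised directly (the strings below are only illustrations):

import pipe

print pipe.cmdtolist("ls -al 'a funny name'")
# -> ['ls', '-al', 'a funny name']
print pipe.cmdtolist("echo can\\'t stop")
# -> ['echo', "can't", 'stop']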

There are defects in the implementation. Currently the Python function commands do not behave as true processes in a pipe - they are all executed from the same Python process. This means they have to be called while the pipeline is being built, so that the whole stream flowing into each function is evaluated up front. Perhaps a better way would be to use blocking threads.
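
One possible direction - this is only a rough sketch of the threaded idea, not part of the recipe, and the class name and wiring are my own assumptions - is to give each Python function a real OS pipe and run it in its own thread, so a downstream subprocess can read from a genuine file descriptor while the function is still writing:

import os
import threading

class ThreadedFpipe:
    # sketch: run fn in its own thread, connected through a real OS pipe
    def __init__(self, fn, stdin=None):
        read_fd, write_fd = os.pipe()
        self.stdout = os.fdopen(read_fd, 'r')   # downstream stages read from here
        self._fout = os.fdopen(write_fd, 'w')
        self.fn = fn
        self.stdin = stdin
        self.thread = threading.Thread(target=self._run)
        self.thread.start()
    def _run(self):
        try:
            self.fn(self.stdin, self._fout)
        finally:
            self._fout.close()                  # closing signals EOF to the reader

Because self.stdout has a real file descriptor, it could be handed to subprocess.Popen as stdin, which the StringIO-based Fpipe cannot do.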

A further convenience function could be written to build a callable object that encapsulates arguments for the pipe function:

def grepper2(fin, fout, args): ...

capture(grepper2, ["-v", "/bin/bash"])

capture() returns a callable object which accepts fin and fout and has args stored as an instance attribute.
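
A minimal sketch of such a wrapper (the name capture follows the description above; the details are my own assumption):

class capture:
    # stores the extra arguments and forwards them on each call
    def __init__(self, fn, args):
        self.fn = fn
        self.args = args
    def __call__(self, fin, fout):
        self.fn(fin, fout, self.args)

Since pipe() only checks callable(cmd), an instance of this class slots straight into a pipeline.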

3 comments

Maxim Krikun 18 years, 1 month ago
George Benko (author) 18 years, 1 month ago

I've seen that code before. The only problem is that every time you need to include a new utility you need to write a function for it.

Christopher Dunn 16 years, 3 months ago

stderr. I like the basic concept. Any idea how to trap stderr from all of these? What if they are all subprocesses -- would that make trapping stderr any easier?