Run scour for a batch of svgs

Asked by Vish on 2011-03-12

For Humanity, we have been saving our files as plain SVG, since that was the option we had at the time.
However, with scour, the SVG file sizes are even smaller.

We would like to convert all the icons (nearly 2000 of them) to this new optimized format, but it seems that scour can only convert one SVG at a time and does not take multiple SVGs as input.

We have the files saved in multiple folders. Is there a way to run scour on all the SVG files at once?

Question information

Language: English
For: Ubuntu scour
No assignee
Solved by:
Louis Simard

Install imagemagick and mogrify.

Are all the files in one folder, or are they in different folders? I can write you a script to convert them in one go for you :)

Vish (vish) said : #2

It's for Humanity; you can check out the branch layout here >

Bug 702423 was the main reason I started looking into scour.

Right now, I already have imagemagick installed, and I have the latest branch from lp:scour.
I just drop the SVG in the /scour folder and use: $ python scour.py -i input.svg -o output.svg --disable-style-to-xml

Which gives me the optimized svg.
But that's for a single file.
I've tried passing 2 files, but it doesn't work that way..

I've tried to look at pitti's; this is what he uses for all packages:

But got lost there.. :D

I'd just need a script which only runs scour with the --disable-style-to-xml option.
Thanks for taking the time :)

You can make a script like:

for f in `find "$LOCATION" -iname "*.svg"`; do
    python scour.py -i "$f" -o "${f%.svg}_out.svg" --disable-style-to-xml
done

Save the file and mark it as executable; it will convert the files and create similarly named files with an _out suffix (as you can see from the code).

I suggest you copy some of the files you intend to manipulate, so that you can test first. If it's OK, then let it rip :)
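One hedged aside: a backtick-wrapped find splits its output on whitespace, so the loop above can break on file names containing spaces. A minimal dry-run sketch of a space-safe variant (assuming the script is invoked as "python scour.py" and that $LOCATION points at your icon tree; remove the echo to actually run it):

```shell
# Dry run: print each scour command that would be executed, safely
# handling file names with spaces via NUL-delimited paths.
# "python scour.py" and $LOCATION are assumptions about your setup.
LOCATION="${LOCATION:-.}"
find "$LOCATION" -iname "*.svg" -type f -print0 |
while IFS= read -r -d '' f; do
    echo python scour.py -i "$f" -o "${f%.svg}_out.svg" --disable-style-to-xml
done
```

Inspect the printed commands first; once they look right, drop the echo and let it rip.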

Scour is not reliable enough to run on input files in place, replacing them.

On a Unix command line, you may issue this command to optimise all .svg files in the current directory, recursively, into .opt.svg files:

find . -name "*.svg" -type f -print0 | xargs -0 -I file sh -c 'F="file"; F="${F%.svg}.opt.svg"; /PATH/TO/scour.py --OPTIONS -i "file" -o "$F"'

or into .svg-opt files:

find . -name "*.svg" -type f -print0 | xargs -0 -I file /PATH/TO/scour.py --OPTIONS -i "file" -o "file-opt"

In either case, /PATH/TO/scour.py points to the scour script and is only needed if it is not in your PATH; --OPTIONS is there to hold your scouring options, like --shorten-ids --enable-id-stripping.
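The renaming inside the first command's sh -c string relies on standard shell suffix stripping; a minimal sketch of what ${F%.svg}.opt.svg does (the path is a made-up example):

```shell
# ${F%.svg} strips a trailing ".svg" suffix, so the optimised copy
# keeps the original base name with .opt.svg appended in its place.
F="./icons/apps/firefox.svg"
F="${F%.svg}.opt.svg"
echo "$F"   # prints ./icons/apps/firefox.opt.svg
```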

Vish (vish) said : #5

Thanks Louis Simard, that solved my question.

Vish (vish) said : #6

Thanks Louis and actionparsnip..

Right now, I'm using scour to save into a different folder, which saves me the renaming later ;-)
I'm using this command:
$ find . -name "*.svg" -type f -print0 | xargs -0 -I file sh -c 'D=~/Scoured; ~/PATH/TO/scour.py --disable-style-to-xml -i "file" -o "$Dfile"'

One nit-picking issue though is that the folder has to be named "~/Scoured." and I'm not sure why that "." is required.

The "." after find means "start searching from the pwd ([p]resent [w]orking [d]irectory)". You can specify a path there if you wish instead of using ".", but you want the command to execute from where you launch it (most of the time, anyway).
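As a guess at the trailing-dot quirk specifically (an inference from the command above, not something stated in the thread): xargs -I file substitutes each path textually inside "$Dfile", and find prints paths with a leading "./", so the shell sees "$D" immediately followed by "./name.svg". A minimal sketch:

```shell
# find emits paths like ./icon.svg, so "$Dfile" becomes "$D./icon.svg";
# the leading "./" glues onto $D, which is why the folder itself must be
# named "Scoured." with the trailing dot.
D="$HOME/Scoured"
echo "$D./icon.svg"   # what "$Dfile" expands to: .../Scoured./icon.svg
# Writing "$D/file" instead would insert a "/", so a plain folder name
# such as ~/Scoured would then work:
echo "$D/./icon.svg"  # .../Scoured/./icon.svg
```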

Bash scripting is HUGELY powerful and is where a LOT of the power in Linux exists. Most users think using the terminal is "old fashioned", but they are severely missing out on what their OS can actually do to make life a whole bunch easier (as you have seen).

I suggest you archive this script and pass it to friends if they need the same thing, or to users who ask for it in forums / support etc :)

Glad you got the gold

Vish (vish) said : #8

> actionparsnip posted a new comment:
> The "." after find means ...

Ah! Thanks for the explanation; otherwise I would have been banging my head about what I was doing wrong :-)

> I suggest you archive this script and pass it to friends if they need
> the same thing, or to users who ask for it in forums / support etc :)

Yup, one of the main reasons I posted it above is to make sure it is publicly archived.. :-)