Latex with Libreoffice (figures and tables)

Previous post here.

The issue with tables is that, unlike the normal flow of the latex document, they are completely foreign to someone who is not a latex user. My proposal, though still a bit unfriendly on the eyes (because a text table is not as pretty as a pdf one), allows my collaborator to see and edit the text.

My approach assumes that you keep your tables and figures in separate files and import them into your main latex document with the `input` command. The main idea is to transform the figure and table files into text and then automatically insert them into the main latex document where they are `input`ed. The result is the same latex document, except that the figures and tables are in text format.

I use `catdvi` to do the latex->text conversion, along with `sed`, `echo`, `cat` and `latex`. The following is the Makefile I am using. You should change the first three variables to the names of your main document, figures and tables files, then just run `make -f NameOfFile`. You might also need to add different targets depending on what you need to compile your latex figure and table documents.

maindoc = paper.tex
figdoc = paperfig.tex
tabdoc = papertab.tex
TXTs = $(figdoc).txt $(tabdoc).txt

$(maindoc).txt: $(TXTs)
	cat $(maindoc) > $(maindoc).txt
	for tfile in $(subst .txt,,$(TXTs)); do \
		sed -i -e "/input{$$tfile}/{r $$tfile.txt" -e "d}" $(maindoc).txt; \
	done
	rm -rf $(TXTs)

%.tex.txt: %.tex
	echo '\\documentclass{article}' > $@.tex
	cat $(maindoc) | grep usepackage >> $@.tex
	echo '\\begin{document}' >> $@.tex
	cat $(subst .txt,,$@) >> $@.tex
	echo '\\end{document}' >> $@.tex
	latex $@.tex
	catdvi $@.dvi > $@

Notice that this code will not include the actual figures in the txt document. I guess I'll be doing that by hand, but with 5 to 10 figures it should not be too time consuming. Also, if the recipe lines lose their leading tabs when copied from this page, Make will complain (I blame wordpress).
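The sed invocation is the part that does the splicing: on a line matching `input{file}`, it reads the corresponding .txt file in and then deletes the matched line itself. Here is a minimal standalone sketch of that trick (file names are made up, and `sed -i` assumes GNU sed):

```shell
# Main document with an \input line, plus the text version of the table file.
printf 'before\n\\input{papertab}\nafter\n' > main.txt
printf 'TABLE LINE 1\nTABLE LINE 2\n' > papertab.txt
# On the matching line: read papertab.txt in, then delete the line itself.
sed -i -e '/input{papertab}/{r papertab.txt' -e 'd}' main.txt
cat main.txt
```

The `r` command queues the file's contents for output at the end of the cycle, so they still appear even though `d` deletes the matched line; that is why the two have to go in separate `-e` expressions.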


Latex with Libreoffice (proposal)

Situation: I think latex is a better way of writing scientific manuscripts and I would like to use it for that purpose. Some prefer to do it with MS word or Libreoffice. This post is about a work-flow that will allow me to write stuff in latex and collaborate with colleagues that use Libreoffice. Caveat: I am the main author of the manuscript that I'm writing and this work-flow is tailored with that in mind.

I did not want to translate my latex doc into an intermediate format and then give it to my collaborators, because it would be difficult to incorporate changes made in that format back into my tex file. I'm using git to track all changes and would like to receive diffs against my original doc. I discarded pushing latex down my collaborators' throats because they are already busy as it is :). I need a solution that allows my collaborators to make changes against the main latex file with whatever they already use for writing.


1. Give them my tex file.

2. Tell them to run the following macro so they can easily distinguish between latex stuff and content stuff.

Option Explicit
Public oDocAuto as Object
Sub Main()
Dim oSearchDesc as Object
Dim oFoundall as Object
Dim xFound as Object
oDocAuto = ThisComponent
oSearchDesc = oDocAuto.createsearchDescriptor()
oSearchDesc.SearchRegularExpression = True
oSearchDesc.SearchWords = False
oSearchDesc.SearchString = "\\[^ %][a-zA-Z0-9 {1}\.\[\]\-]*\{|\}|\$|\\hline|\\ |tbl:[a-zA-Z0-9]*\}|fig:[a-zA-Z0-9]*\}|\\circ|\\\\|\\[a-zA-Z]*$"
oFoundall = oDocAuto.FindAll(oSearchDesc)
xFound = oDocAuto.findFirst(oSearchDesc)
do while not IsNull(xFound)
  xFound.CharColor = 16711680 ' red (RGB FF0000)
  xFound = oDocAuto.findNext( xFound.End, oSearchDesc )
loop
End Sub

3. Tell my collaborators that they should edit as if they were writing a normal document, and make it clear that they must save to a text file (as opposed to odf or doc). I'll also make it clear that they are to ignore the red text (the macro paints common latex stuff in red).

4. If they edit and save in text format, I should be able to diff the resulting document with git and manually pick individual parts of the diff as separate commits.
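Step 4 might look like this in practice (a sketch, assuming git is installed; file names and contents are made up):

```shell
# Simulate the round trip with a throwaway repo.
git init -q demo && cd demo
printf 'Intro text.\nMore text.\n' > paper.tex
git add paper.tex
git -c user.name=me -c user.email=me@example.com commit -qm "original draft"
# The collaborator returns an edited plain-text copy; drop it over the tracked file:
printf 'Intro text, edited.\nMore text.\n' > paper.tex
git diff --stat paper.tex
# git add -p paper.tex  # interactively stage individual hunks as separate commits
```

`git add -p` is what makes the "individual parts as separate commits" part possible, since it lets you stage one hunk at a time.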

I am not yet clear on how to handle tables and figures.


EcoIP is out!!!

The EcoIP code has been available since the start of 2012 (approx.). It has been changing and will continue to do so as long as I am involved in the automation of ecological measurements. Today marks a milestone in the project, as the paper describing it has been published in Ecological Informatics. You can find it here. This is very gratifying for me and serves as validation that we might be heading in the right direction by trying to automate ecological measurements.

Some example figures:

Images from the James Reserve. (A) Canopy of deciduous oak (Quercus sp.). (B) Close-up of deciduous oak (Quercus sp.). (C) Bracken ferns (P. aquilinum). (D) Yellow meadow wallflowers (E. capitatum).

Fig. 1. EcoIP data processing work-flow begins with creating the ITS. Models and signals are created with input from image series. Model creation is iterative. Data used for model creation is ignored in signal generation.


Image metadata is important!!!

I want to use some images taken a long time ago, and my life would be much easier if I had as much metadata for each image as possible. The only metadata I currently have is image size, file size, file name and file type. What I really need is focal length, lens characteristics, aperture and exposure times. I'm not talking about getting a list as big as my Nikon's; I just want the basics.

Note to self, for future camera projects, buy one that gives you some basic metadata!!!!


Define default values for bash script arguments

Note to self: bash is cool!!!!

I was made aware that you can define a default value for a variable in the following manner: `${parameter:-word}`.

So if parameter is unset or null, the expression is replaced with the expansion of 'word'. I went searching the bash man page for more information and it has lots of other goodies:

${parameter:-word} : word replaces the expression when parameter is unset or null.
${parameter:=word}: word is assigned to parameter when parameter is unset or null.
${parameter:?word}: the expansion of word is written to stderr (and the shell exits) if parameter is unset or null.
${parameter:+word}: word replaces parameter when parameter is set and non-null (the opposite of :-); otherwise nothing is substituted.
${parameter:offset:length}: A substring of parameter is expanded.

For more specifics on how these work, check out the bash man page under the parameter expansion section.
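For instance, a quick demo of the expansions above (variable names are made up; the substring form is bash-specific):

```shell
#!/bin/bash
unset name
echo "${name:-guest}"      # guest; name stays unset
echo "${name:=guest}"      # guest; name is now assigned
echo "${name:+present}"    # present, since name is now set and non-null
word="parameters"
echo "${word:0:5}"         # param
```

The practical difference between `:-` and `:=` is that only the latter changes the variable, which is why the `:+` line prints something afterwards.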


Creating an R package

I recently put the EcoIP application into an R package. I was surprised at how easy it was and I wanted to share my experience :)

Root Path

Everything that is to be packaged needs to be in a staging directory. It will contain all the source files and the meta-files that describe the package. In my case, my root path was EcoIP:

$ ls EcoIP
DESCRIPTION  NAMESPACE  R  inst  man

DESCRIPTION and NAMESPACE are files and the rest are directories. Let's see these one by one…


The DESCRIPTION file

This file contains information about the package: version, name, author…. Here is my DESCRIPTION file:

Package: EcoIP
Type: Package
Title: Ecological Image Processing
Version: 0.1-20120726
Date: 20120726
Author: Joel Andres Granados <>
Maintainer: Joel Andres Granados <>
Description: EcoIP detects phenological phases based on image series.
License: GPL-3
Depends: EBImage,digest,fields,RSVGTipsDevice

Though the description file has more meta variables, these are arguably the main ones. Or, at least, the ones I used :).  All of them are pretty self-explanatory.


The NAMESPACE file

This is the file that contains what is to be exported into the R namespace when the package is ‘imported’ with the `library` command. Here is my NAMESPACE file:


The R files included in EcoIP each contain lots of functions that could potentially be exported into the R namespace. With this file you explicitly tell R which functions you want exported, and the rest stay hidden. Here you put everything that you want the user to be able to execute.
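A NAMESPACE is essentially a list of export directives; a sketch of one for this package, using the eip.* function names from the man pages as an example (my actual file may differ), looks like:

```
export(eip.genOutput)
export(eip.histcmp)
export(eip.nbm)
export(eip.plot)
export(eip.showModel)
```

After `library(EcoIP)`, only these functions would be visible to the user.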

R Directory

Here is where all the R source files go. I did not have any other type of file, so I did not need other directories; but R requires other directories if you plan to use other languages like Java or C. Here are the contents of my R directory.

$ ls R
colorTrans.R common.R ecoip.R imageTrans.R naiveBayes.R

Something worth mentioning is that I had to change my `source` calls. Before creating the package I expected the source files to be in a certain place; after packaging, the source files live in a relative path that starts at R/. So if I wanted to ‘source’ a file I would have to do it like this:

source("R/common.R")
Other than this, the files did not suffer any other package related changes.

The man directory

Here is where all the manual pages are located. Something that I have noticed in the past with R packages is that they are very well documented. One reason for this is that R provides a handy command that practically creates the package: `package.skeleton`. In my case I only used it to create the manual pages, but that is just me :). You can use the command to create the whole package skeleton:

package.skeleton("EcoIP", code_files=c("colorTrans.R", "common.R", "ecoip.R", "imageTrans.R", "naiveBayes.R"))

The command will create the root path and all its contents. I only used it to generate the documentation templates, since they were really helpful when writing the documentation. In my case I only had to fill in the explanations for each function argument, give some general information about the function, and that was it. The stuff you don't need from the template you can just erase. Since all my exported functions were in just one file, I generated the documentation templates with the following command:

package.skeleton("EcoIP", code_files=c("ecoip.R"))

Then I went and fished out the templates and changed them to my liking. Here is the listing of the man directory.

$ ls man
EcoIP-package.Rd eip.genOutput.Rd eip.histcmp.Rd eip.nbm.Rd eip.plot.Rd eip.showModel.Rd

The inst directory

In my case I had some data that I wanted to include with my package. It was a bit tricky since it was not in the regular formats that R expects (csv file, Rdata, tar.gz…). I had to put it in a place where the user would be able to find it, and I found that putting it in inst/extdata did the trick. The data I included was some images, and the user can access them by using the `system.file` command. So if I wanted to list the contents of the inst/extdata directory, I would execute this command in the R environment:

list.files(system.file("extdata", package="EcoIP"))

Creating the package

Once you have your root path ready, you can run the following commands:

R CMD build ROOT

This command creates a tar.gz file. To check that the package is ‘sane’, you should run the following command:

R CMD check ROOT.tar.gz


I had some issues when I first tried to check the package. They were due to my code trying to source files from places other than the R directory. To avoid this, write your `source` calls as if all the files were in an R directory.

I also had an issue where the package could only be versioned by two numbers: major.minor. I didn't quite find a way around this problem.


I found that EBImage's way of installation is really cool, so I created a file in my source tree to mimic the behavior of EBImage. Now all you have to do to install EcoIP into an R environment is execute two commands:


GPG related links

I have recently been drawn back to mutt. Along with the migration, I would like to start signing my mails again; while I was using the gmail web interface I did not spend time figuring out how to sign mails. This post is to remind me of the cool gpg links that I found along the way. Feel free to comment if you know of a link that is not here :)

General gpg information

gpg and mutt

Moving gpg keys
