Sunday, October 30, 2011

SciPy India 2011 abstracts due November 2nd

The third SciPy India Conference will be held from December 4th through the 7th at the Indian Institute of Technology Bombay (IITB) in Mumbai, Maharashtra, India.

At this conference, novel applications and breakthroughs made in the pursuit of science using Python are presented. Attended by leading figures from both academia and industry, it is an excellent opportunity to experience the cutting edge of scientific software development.

The conference is followed by two days of tutorials and a code sprint, during which community experts provide training on several scientific Python packages.

We invite you to take part by submitting a talk abstract on the conference website.

Talk/Paper Submission

We solicit talks and accompanying papers (either formal academic or magazine-style articles) that discuss topics regarding scientific computing using Python, including applications, teaching, development and research. We welcome contributions from academia as well as industry.

Important Dates

  • November 2, 2011, Wednesday: Abstracts Due
  • November 7, 2011, Monday: Schedule announced
  • November 28, 2011, Monday: Proceedings paper submission due
  • December 4-5, 2011, Sunday-Monday: Conference
  • December 6-7, 2011, Tuesday-Wednesday: Tutorials/Sprints

Organizers

Saturday, October 9, 2010

What's the best way to interleave two Python lists?

[NOTE:  I wrote this in January 2009, but didn't publish it.  Originally, I planned to provide a short discussion about each of the potential solutions listed below, which I never got around to doing.  Anyway, I just noticed my draft and decided to go ahead and publish it without adding any more discussion.  The code snippets seem fairly self-explanatory.  If anyone has any comments on the various solutions, I would be very interested in hearing them.]


Until early 2009, I had to add the following site.cfg file to build numpy or scipy on my 64-bit Fedora Linux box:
[DEFAULT]
library_dirs = /usr/lib64
To make numpy aware of this default location, I had to add /usr/lib64 to default_lib_dirs (which I will refer to as lib_dirs for brevity) in numpy/distutils/system_info.py.

Where do 64-bit libraries belong?

The lib64 directory is the default location for 64-bit libraries on Red Hat-based systems. Unfortunately, not all Linux distributions follow this convention; fortunately, most distributions that don't use lib64 as the default location for 64-bit libraries at least create a lib64 symlink pointing to whatever their default location happens to be. So it appears I can assume that, on a 64-bit machine, looking in lib64 before lib should work in most cases.

Since I only wanted to add the lib64 path on 64-bit machines, I changed the assignment to:
lib_dirs = libpaths(['/usr/lib'], platform_bits)
where libpaths returns ['/usr/lib'] when platform_bits is 32 and ['/usr/lib64', '/usr/lib'] when it is 64. I used the platform module to set platform_bits:
# Determine number of bits
import platform
_bits = {'32bit':32,'64bit':64}
platform_bits = _bits[platform.architecture()[0]]
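For reference, platform.architecture() returns a tuple whose first element is the '32bit'/'64bit' string used as the dictionary key above; on a typical 64-bit Linux box the lookup goes like this (illustrative interactive session):

>>> import platform
>>> platform.architecture()
('64bit', 'ELF')
>>> _bits[platform.architecture()[0]]
64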


An outline of the solution

So far everything has been pretty straightforward. Now all that is left is to write libpaths.
def libpaths(paths, bits):
    """Return a list of library paths valid on 32 or 64 bit systems.

    Parameters
    ----------
    paths : sequence
        A sequence of strings (typically paths)
    bits : int
        An integer, the only valid values are 32 or 64.

    Examples
    --------
    >>> paths = ['/usr/lib']
    >>> libpaths(paths, 32)
    ['/usr/lib']
    >>> libpaths(paths, 64)
    ['/usr/lib64', '/usr/lib']
    """
    if bits not in (32, 64):
        raise ValueError
    # Handle 32-bit case
    if bits == 32:
        return paths
    # Handle 64-bit case
    return ????


How to skin the cat?

So we finally arrive at the motivation for this post. At this point, I started thinking that if I had two equal-sized lists, there should be a simple function for interleaving their elements to make a new list. Something like zip. But zip returns a list of tuples. After discussing this with several people (Fernando Pérez, Brian Hawthorne, and Stéfan van der Walt), we came up with several different solutions.
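To see why zip alone doesn't quite do it, note that it pairs the elements up rather than producing a flat list (a quick illustrative session):

>>> paths = ['/usr/lib', '/usr/local/lib']
>>> zip(paths, [p + '64' for p in paths])
[('/usr/lib', '/usr/lib64'), ('/usr/local/lib', '/usr/local/lib64')]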

  • Solution 1:

from itertools import cycle
paths64 = (p + '64' for p in paths)
return list((x.next() for x in cycle([iter(paths), paths64])))

  • Solution 2:

def _():
    for path in paths:
        yield path
        yield path + '64'
return list(_())

  • Solution 3:

out = [None] * (2 * len(paths))
out[::2] = paths
out[1::2] = (p + '64' for p in paths)
return out

  • Solution 4:

out = []
for p in paths:
    out.append(p)
    out.append(p + '64')
return out

  • Solution 5:

out = []
for p in paths:
    out.extend([p, p + '64'])
return out

  • Solution 6:

return [item for items in zip(paths, (p+'64' for p in paths)) for item in items]

  • Solution 7:

from operator import concat
return reduce(concat, ([p, p + '64'] for p in paths))
I liked Solution 5 the best and it is what I used.
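For completeness, here is a sketch of what the finished libpaths looks like with Solution 5 dropped in.  Note that the '64' variant goes first so the output matches the docstring example above; this is my reconstruction rather than necessarily the exact code that ended up in numpy/distutils/system_info.py:

def libpaths(paths, bits):
    """Return a list of library paths valid on 32 or 64 bit systems."""
    if bits not in (32, 64):
        raise ValueError
    # 32-bit: the paths are already correct.
    if bits == 32:
        return paths
    # 64-bit: look in the 'lib64' variant before the plain directory.
    out = []
    for p in paths:
        out.extend([p + '64', p])
    return out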

An itertools recipe

While we were looking for a solution, Fernando and I came up with the following recipe:

from itertools import cycle, imap

def fromeach(*iters):
    """Take elements one at a time from each iterable, cycling them all.

    It returns a single iterable that stops whenever any of its arguments
    is exhausted.

    Note: it differs from roundrobin in the itertools recipes, in that
    roundrobin continues until all of its arguments are exhausted (for
    this reason roundrobin also needs more complex logic and thus has
    more overhead).

    Examples:
    >>> list(fromeach([1, 2], [3, 4]))
    [1, 3, 2, 4]
    >>> list(fromeach('ABC', 'D', 'EF'))
    ['A', 'D', 'E', 'B']
    """
    return (x.next() for x in cycle(imap(iter, iters)))
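With that recipe in hand, the 64-bit interleaving could also be written in one line (again just a sketch, ordering the '64' variants first to match the docstring example):

>>> paths = ['/usr/lib']
>>> list(fromeach((p + '64' for p in paths), paths))
['/usr/lib64', '/usr/lib']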

Friday, September 24, 2010

SciPy India 2010 Call for Papers

The second SciPy India Conference will be held from December 13th to 18th, 2010 at IIIT-Hyderabad.

At this conference, novel applications and breakthroughs made in the pursuit of science using Python are presented.  Attended by leading figures from both academia and industry, it is an excellent opportunity to experience the cutting edge of scientific software development.

The conference is followed by two days of tutorials and a code sprint, during which community experts provide training on several scientific Python packages.

We invite you to take part by submitting a talk abstract on the conference website.

Talk/Paper Submission

We solicit talks and accompanying papers (either formal academic or magazine-style articles) that discuss topics regarding scientific computing using Python, including applications, teaching, development and research.  Papers are included in the peer-reviewed conference proceedings, published online.

Please note that submissions primarily aimed at the promotion of a commercial product or service will not be considered.

Important Dates
Monday, Oct. 11: Abstracts Due
Saturday, Oct. 30: Schedule announced
Tuesday, Nov. 30: Proceedings paper submission due
Monday-Tuesday, Dec. 13-14: Conference
Wednesday-Friday, Dec. 15-17: Tutorials/Sprints
Saturday, Dec. 18: Sprints

Organizers
Jarrod Millman, Neuroscience Institute, UC Berkeley, USA (Conference Co-Chair)
Prabhu Ramachandran, Department of Aerospace Engineering, IIT Bombay, India (Conference Co-Chair)
FOSSEE Team

Tuesday, December 15, 2009

SciPy India 2009 update

Today is the fourth day of the 2009 SciPy India conference.  Although the first SciPy conference was held in the US in 2002, 2008 was the first year the conference was held in Europe, and this year is the first time it has been held in India.  It is a sign of the growing interest in using Python for scientific computing that there are now three annual conferences.

During the SciPy 2009 conference in August, Prabhu Ramachandran spoke about the Free and Open source Software for Science and Engineering Education (FOSSEE) project he was running at IIT Bombay.  FOSSEE is an ambitious project to promote the use of Python for numerical computing in college curricula.  Prabhu has an interesting post on the contributions the scientific Python community has made to the larger Python community.  FOSSEE is actually just one part of an even more ambitious $1 billion (US) government program called the National Mission on Education through Information and Communication Technology.

Starting at the end of May 2009, Prabhu very quickly gathered together an amazing team that immediately created a significant amount of documentation and training materials, including tutorials, audio/video demonstrations, written material, and lectures.  They've created a great two-day hands-on introductory tutorial to scientific programming with Python and have already conducted several of these tutorials all across India.  Now they are working on creating a couple of semester-long college courses and will be offering the first one next semester at IIT Bombay.

At the end of the SciPy 2009 conference in August, Prabhu proposed that we put together a SciPy conference in India and I immediately agreed.  Not wanting to delay, we decided to have the conference before the end of the year.  After all, putting together an international scientific conference in less than four months was in keeping with the overall ambition of the FOSSEE project.  As soon as Prabhu returned to Mumbai, he contacted Vimal Josef at SPACE Kerala about hosting the conference in Thiruvananthapuram.  Shortly after that we announced the first international conference on Scientific Computing with Python (Scipy.in 09), held from December 12th to the 17th at Technopark, Thiruvananthapuram, and sponsored by FOSSEE, IIT Bombay, and SPACE Kerala.

Once we finalized the dates for the conference, I called Travis Oliphant, the president of Enthought, and asked him to deliver the keynote address, which he quickly agreed to do.  Among his many accomplishments, Travis is one of the original authors of SciPy and the primary developer of NumPy.  David Cournapeau (one of the core NumPy and SciPy developers) and Chris Burns (one of the core developers of the neuroimaging in Python project) also agreed to deliver invited talks.

The FOSSEE and SPACE teams were invaluable in organizing the conference.  In particular, Madhusudan.C.S from the FOSSEE team worked very closely with me on the conference website and on putting together the conference program.  I will write another blog post in the next day or so with a description of the actual conference.  For now, you can read a short write-up from one of the local newspapers.

Sunday, November 29, 2009

Sunday in Paris

I spent most of today working on the SciPy 2009 proceedings with Gael and catching up on sleep and email.  For dinner, Gael, Emmanuelle, and I met Jean-Baptiste Poline at the Denfert-Rochereau station and found a very traditional French wine bar called Au Vin des Rues, which is on rue Boulard just off of rue Daguerre and was open on a Sunday evening.  (I had a delicious slow-roasted leg of lamb with potatoes au gratin and rum baba for dessert.)  The rue Daguerre has a wonderful pedestrian street market I often seem to visit when I am in Paris.  Here is a picture looking down the rue Daguerre (the street market is closed, of course) toward rue Boulard (you can see JB just right of center with Gael peeking over Emmanuelle's shoulder):




PJ Toussaint, who just flew back from a conference in Greece, joined us just in time for dessert.

Saturday in Paris

Just thought I'd try to keep a little journal of my trip.  We'll see how long this lasts.  Anyway, I landed at Charles de Gaulle at about 6:30am on Saturday morning and took the RER to Bourg-la-Reine to stay with Gael and Emmanuelle.  Here is the entrance to where I am staying:



And the view from my window:



Once I arrived I took a short nap and then went out with Gael to hunt and gather in the market above the passage he lives on.  Here are two pictures I took at the local market (they had all kinds of things, but cheese and meat are, of course, the things that attracted me most):




We also went into a frozen food store called Picard:



After lunch, Gael and I headed to Paris.  Over the last five years, I've tried to visit the catacombs numerous times.  Unfortunately, every time I've tried, they've been closed.  This time turned out to be no different.  The picture on the left is Gael standing in front of the entrance to the catacombs (you can see a sign on the door, which states that they will be closed for the next month) and the picture on the right is of the road behind me:



Since I couldn't visit the catacombs, we decided to head to the Montmartre district to walk around for the afternoon.  Here are a couple of pictures of the Sacré Coeur at the summit of Montmartre:



I forgot to take pictures for the rest of the day, but after Montmartre we headed back to the Latin Quarter to spend a couple of hours talking about the SciPy proceedings (which we hope to finish today) in a cafe with wireless internet.  And we grabbed an early dinner at 8pm with Emmanuelle and a couple of colleagues from Neurospin.

Wednesday, November 18, 2009

NumPy 1.4 coming soon!

Nearly eight months after NumPy 1.3, NumPy 1.4 will be released well before the holidays.  This release comes with the usual raft of bug fixes, performance improvements, new features, and improved documentation.

Our web-based documentation editing system continues to be a great asset.  In just over a year, this system has helped us to vastly expand and improve our documentation.  For instance, our reference guide has gone from under 10,000 words to over 110,000.  When Guido came to visit Berkeley a few weeks ago, Fernando Pérez showed him the web-based documentation editor and he was very impressed and even commented that it would be nice for the standard library to use a system like this.

David Cournapeau is again serving as release manager and he is also responsible for many of the code improvements in this release.  I just noticed the other day that, according to Ohloh, David is quickly approaching Travis Oliphant's number of commits.  While it is hard to attach any specific meaning to this statistic, it is clear that at this point David is one of the most significant contributors to NumPy.  Among his many contributions to this release, he reduced numpy's import time by 20%-30% by adding a small numpy.lib.inspect module and using it instead of the upstream inspect module.  Another very useful improvement by David is that you can now link against the core math library in an extension.
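As a rough illustration of that last point, numpy.distutils exposes the flags needed to build against the core math library through get_info('npymath'); a minimal setup.py along these lines should work (the package name, extension name, and foo.c source are placeholders):

# setup.py sketch: build a C extension that links against npymath.
def configuration(parent_package='', top_path=None):
    from numpy.distutils.misc_util import Configuration, get_info
    config = Configuration('mypkg', parent_package, top_path)
    # get_info('npymath') supplies the include directories, library
    # directories, and libraries needed to call the npy_* C functions.
    config.add_extension('foo',
                         sources=['foo.c'],
                         extra_info=get_info('npymath'))
    return config

if __name__ == '__main__':
    from numpy.distutils.core import setup
    setup(configuration=configuration)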

In addition to all the work David's done for the 1.4 release, some of his recent work won't be included until the 1.5 release.  Once David branches for 1.4, he has already promised to merge his Python 3 support for numpy.distutils into the trunk.  While we are just beginning to plan the migration to Python 3, this is an important early step.

Unfortunately, I am not sure the new datetime dtype support for dealing with dates in arrays will be included in the 1.4 release.  This useful functionality was developed over the summer by Travis Oliphant and Marty Fuhry.  Marty was my Google Summer of Code student, although I was pretty busy, so Pierre Gérard-Marchant did most of the day-to-day mentoring.  Despite the fact that this code was merged with the trunk at the end of the summer, there is a reasonable chance that it will be pulled before the 1.4 release due to the lack of documentation for the public C API.
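To give a flavor of what the datetime dtype makes possible, here is how it looks in later NumPy releases, where the feature eventually stabilized (treat this as illustrative rather than the exact 1.4-era API):

>>> import numpy as np
>>> dates = np.array(['2009-11-18', '2009-12-25'], dtype='datetime64[D]')
>>> dates
array(['2009-11-18', '2009-12-25'], dtype='datetime64[D]')
>>> dates[1] - dates[0]
numpy.timedelta64(37,'D')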

I've only touched on a few of the many improvements you can expect to see with NumPy 1.4.  For more details about the upcoming release, please see the release notes.  Thanks to everyone who worked on this release and to David in particular.