Hypertree

Complementary Approaches to Constraint Satisfaction


Software downloads

We have designed and implemented several algorithms for generating hypertree decompositions. The algorithms have been evaluated on benchmark examples from the literature and from industry. Currently, the following algorithms are implemented:

 

  • Algorithms based on Bucket Elimination
  • Algorithms based on Dual Bucket Elimination
  • Branch Decomposition heuristic
  • Dual Branch Decomposition heuristic
  • Algorithm based on Acyclic Hypergraph Sandwich Problem
  • Algorithms based on Hypergraph Partitioning:
    • Eigenvector heuristic
    • Dual Eigenvector heuristic
    • Partitioning through Fiduccia-Mattheyses heuristic
    • Partitioning through HMETIS Library
  • Algorithm that combines Hypergraph Partitioning Algorithms and Bucket Elimination Algorithm
  • A Backtracking-Based Algorithm (det-k-decomp)

 

Detailed information about the implemented algorithms can be found in the project publications: http://www.dbai.tuwien.ac.at/proj/hypertree/publications.html

 

If you have questions concerning the programs, please send an email to: [Project Name]@dbai.tuwien.ac.at

Benchmark examples

benchmarks.zip

Executable and object files:

Windows:

htdecomp.exe

htdecompWIN_EV.zip

htdecompWIN_HMET.zip

 

detkdecomp_Win32.zip

 

BE_Win32.zip (a faster reimplementation of the bucket-elimination-based algorithm)

 

 

Linux:

htdecomp

htdecomp_EV.tar

htdecomp_HM.tar

 

detkdecomp_Linux.tar.gz

 

System requirements

Windows: Win32 (Windows 9x, Windows ME, Windows NT, Windows 2000, Windows XP)

Linux: The programs have been tested on SUSE Linux 10.0.

 

CPU and memory requirements depend on the size of the input problem. However, to avoid very long running times you should have at least 128 MB of RAM.

 

Java 2 and VGJ (Visualizing Graphs with Java) are required for the graphical representation of the output files, which are in GML format.

VGJ can be downloaded from this link:

http://www.eng.auburn.edu/department/cse/research/graph_drawing/graph_drawing.html

 

Installing the program

Windows:

  • Copy htdecomp.exe into a directory of your choice. With this executable you can use the following methods:
    1. BE
    2. DBE
    3. SAND
    4. FM_PURE
    5. FM_PURE_WEIGHTS
    6. FM_NAIVE
    7. FM_NAIVE_WEIGHTS

 

  • To be able to use the HMETIS heuristic, first unzip the archive htdecompWIN_HMET.zip and copy its contents into a directory of your choice. Then download the file libhmetis.lib from the hMETIS library (http://www-users.cs.umn.edu/~karypis/metis/hmetis/download.html) into that directory.

Afterwards you have to execute the following command line for linking:

 

LINK /out:htdecomp.exe *.*

 

With this file you can use the following methods:

    1. BE
    2. DBE
    3. SAND
    4. FM_PURE
    5. FM_PURE_WEIGHTS
    6. FM_NAIVE
    7. FM_NAIVE_WEIGHTS
    8. HMET
    9. HMET-BUCKET

 

 

  • To be able to use the Eigenvector and Branch Decomposition heuristics, first unzip the archive htdecompWIN_EV.zip and copy its contents into a directory of your choice. Then download the following files from the Concorde library (http://www.tsp.gatech.edu/concorde/downloads/downloads.htm) into that directory:
    1. allocrus.c
    2. cut_st.c
    3. util.h
    4. machdefs.h
    5. cut.h

 

and the following files from Engeln-Müllges, Gisela: Numerik-Algorithmen mit ANSI C-Programmen:

    1. basis.c
    2. feigen.c
    3. vmblock.c
    4. basis.h
    5. vmblock.h

 

Afterwards you have to execute the following command lines for compiling and linking:

             

CL /c *.c

LINK /out:htdecomp.exe *.obj

 

With this file you can use the following methods: 

    1. BE
    2. DBE
    3. SAND
    4. FM_PURE
    5. FM_PURE_WEIGHTS
    6. FM_NAIVE
    7. FM_NAIVE_WEIGHTS
    8. EV
    9. DEV
    10. BD
    11. DBD

           

Linux:

  • Copy htdecomp into a directory of your choice. With this file you can use the following methods:
    1. BE
    2. DBE
    3. SAND
    4. FM_PURE
    5. FM_PURE_WEIGHTS
    6. FM_NAIVE
    7. FM_NAIVE_WEIGHTS

 

  • To be able to use the HMETIS heuristic, first unzip the archive htdecomp_HM.tar and copy its contents into a directory of your choice. In that directory, create a subdirectory HMETIS and download the file libhmetis.a from the hMETIS library (http://www-users.cs.umn.edu/~karypis/metis/hmetis/download.html) into it.

Afterwards you have to execute the following command line for linking:

 

g++ -o htdecomp *.o -lhmetis -L HMETIS

         

With this file you can use the following methods:

    1. BE
    2. DBE
    3. SAND
    4. FM_PURE
    5. FM_PURE_WEIGHTS
    6. FM_NAIVE
    7. FM_NAIVE_WEIGHTS
    8. HMET
    9. HMET-BUCKET

 

  • To be able to use the Eigenvector and Branch Decomposition heuristics, first unzip the archive htdecomp_EV.tar and copy its contents into a directory of your choice. Then download the following files from the Concorde library (http://www.tsp.gatech.edu/concorde/downloads/downloads.htm) into that directory:
    1. allocrus.c
    2. cut_st.c
    3. util.h
    4. machdefs.h
    5. cut.h

 

and the following files from Engeln-Müllges, Gisela: Numerik-Algorithmen mit ANSI C-Programmen:

    1. basis.c
    2. feigen.c
    3. vmblock.c
    4. basis.h
    5. vmblock.h

 

Afterwards you have to execute the following command lines for compiling and linking:

 

gcc -c *.c

g++ -o htdecomp *.o

 

With this file you can use the following methods: 

    1. BE
    2. DBE
    3. SAND
    4. FM_PURE
    5. FM_PURE_WEIGHTS
    6. FM_NAIVE
    7. FM_NAIVE_WEIGHTS
    8. EV
    9. DEV
    10. BD
    11. DBD

 

 

Starting the program

 

The decomposition program is started with the htdecomp command.

 

htdecomp -f inputfile -d algorithm [-d algorithm] . . . [-d algorithm] [-o outputfile] [-def] [-stats]

 

Please note that the sequence of command line arguments does not matter.
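
For instance, a call that runs two algorithms, writes the result to myOutput.gml and prints statistics could look as follows (the file names are only illustrative):

htdecomp -f myFile.txt -d BE -d DBE -o myOutput.gml -stats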

The decomposition program will accept the following command line arguments.

 

 

Part                             Description

 

-f inputfile                 A hypergraph H=(V,E), where V denotes the vertices and E the hyperedges, has to be stored in a text file in which each line contains a relation between a hyperedge and the vertices contained in that hyperedge. The different relations must be separated by commas. Any line that starts with “%” is a comment line and will be skipped.

                            There are two different forms of the input file, depending on whether the variables and atoms are already pre-defined or not.

                            A pre-definition is given as:

<defVar : Var1,Var2,...,Varn >

<defRel : Edge1/N1 , Edge2/N2 , . . ., Edgem/Nm >

where ‘defVar’ lists all vertices, ‘defRel’ lists all hyperedges, and ‘/N’ states how many vertices the hyperedge contains.

If a pre-definition exists and the -def command line option is given, consistency is required (see -def).

 

Example:
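
The following small input file is only an illustration: the edge and variable names are made up, and the exact relation syntax edge(var,...,var) with comma-separated relations is an assumption based on the description above.

% a small example hypergraph
<defVar : A,B,C,D>
<defRel : e1/2,e2/2,e3/3>
e1(A,B),
e2(B,C),
e3(A,C,D).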

 

 

-d algorithm          The -d option requires a decomposition algorithm, which has to be indicated by ‘algorithm’. Depending on the selected decomposition algorithm, the argument ‘algorithm’ can require further parameters, indicated as construction-scheme (see the algorithms which require a construction-scheme).

                             The following values of ‘algorithm’ are possible:

                  

         

1.     The algorithms which require no construction-scheme

 

•         BE: Bucket Elimination Algorithm

This decomposition algorithm uses the topological structure of the problem, exploiting the property that an optimal variable order produces a tree decomposition of optimal width, which is afterwards extended to a hypertree decomposition by using set-covering heuristics. The variable order is chosen heuristically. For more information about the Bucket Elimination Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d BE
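
To illustrate the general idea only (this is not the project's C/C++ implementation; the toy hypergraph, the function names and the simple min-degree ordering heuristic are assumptions for the sketch), a minimal Python sketch of the pipeline — heuristic elimination order, tree-decomposition bags, greedy hyperedge cover — could look as follows:

# Illustrative sketch: elimination order -> tree-decomposition bags -> greedy cover.
def primal_graph(hyperedges):
    # Vertices are adjacent iff they occur together in some hyperedge.
    adj = {}
    for edge in hyperedges.values():
        for v in edge:
            adj.setdefault(v, set()).update(u for u in edge if u != v)
    return adj

def min_degree_order(adj):
    # A simple ordering heuristic: repeatedly eliminate a vertex of minimum degree.
    adj = {v: set(ns) for v, ns in adj.items()}
    order = []
    while adj:
        v = min(adj, key=lambda x: len(adj[x]))
        order.append(v)
        neighbours = adj.pop(v)
        for a in neighbours:
            adj[a] |= neighbours - {a}   # fill-in edges between the neighbours
            adj[a].discard(v)
    return order

def bags(adj, order):
    # Each eliminated vertex yields a bag: the vertex plus its current neighbours.
    adj = {v: set(ns) for v, ns in adj.items()}
    result = []
    for v in order:
        bag = {v} | adj[v]
        result.append(bag)
        for a in adj[v]:
            adj[a] |= adj[v] - {a}
            adj[a].discard(v)
        del adj[v]
    return result

def greedy_cover(bag, hyperedges):
    # Cover the bag's variables with hyperedges (set-covering heuristic).
    uncovered, chosen = set(bag), []
    while uncovered:
        best = max(hyperedges, key=lambda e: len(hyperedges[e] & uncovered))
        chosen.append(best)
        uncovered -= hyperedges[best]
    return chosen

# Hypothetical toy hypergraph: edge name -> set of variables.
H = {"e1": {"A", "B"}, "e2": {"B", "C"}, "e3": {"A", "C", "D"}}
G = primal_graph(H)
for bag in bags(G, min_degree_order(G)):
    print(sorted(bag), "covered by", greedy_cover(bag, H))

The bags computed from the elimination order form the chi-labels of a tree decomposition; the greedy cover of each bag corresponds to the extension to a hypertree decomposition mentioned above.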

 

         

•         DBE: Dual Bucket Elimination Algorithm

This decomposition algorithm applies the Bucket Elimination algorithm to produce a tree decomposition of the dual hypergraph, which is afterwards extended to a hypertree decomposition by adding chi-labels. For more information about the Dual Bucket Elimination Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d DBE

 

 

•         EV: Eigenvector Heuristic

This decomposition algorithm applies the Eigenvector branch decomposition heuristic, which produces a branch decomposition of the hypergraph that is afterwards transformed into a tree decomposition and a hypertree decomposition. For more information about the Eigenvector Branch Decomposition Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d EV

 

                            

•         DEV: Dual Eigenvector Heuristic

This decomposition algorithm applies the Eigenvector branch decomposition heuristic, which produces a branch decomposition of the dual hypergraph that is afterwards transformed into a tree decomposition and a hypertree decomposition. For more information about the Dual Eigenvector Branch Decomposition Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d DEV

 

 

•         BD: Branch Decomposition Heuristic

This decomposition algorithm applies the branch decomposition heuristic of Cook and Seymour, which produces a branch decomposition of the hypergraph that is afterwards transformed into a tree decomposition and a hypertree decomposition. For more information about the Branch Decomposition Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d BD

 

 

•         DBD: Dual Branch Decomposition Heuristic

This decomposition algorithm applies the branch decomposition heuristic of Cook and Seymour, which produces a branch decomposition of the dual hypergraph that is afterwards transformed into a tree decomposition and a hypertree decomposition. For more information about the Branch Decomposition Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d DBD

                                     

 

 

•         HMET_BUCKET: HMETIS - BUCKET Algorithm

This decomposition algorithm is a combination of the hMETIS algorithm (see HMETIS: hMETIS Algorithm) and the Bucket Elimination algorithm (see BE: Bucket Elimination Algorithm). For more information about the HMETIS - BUCKET Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d HMET_BUCKET

 

•         SAND: Sandwich Problem Algorithm

This decomposition algorithm finds a hypertree decomposition of the given hypergraph based on the algorithm for the acyclic hypergraph sandwich problem. The algorithm requires two additional integer parameters, which represent the imposed decomposition width and the type of ordering of the nodes of the input hypergraph.

Example:

htdecomp -f myFile.txt -d SAND Width OrderType

Width: the imposed decomposition width (integer value)

OrderType: determines the type of ordering of the nodes of the input hypergraph, where:

1 - normal order of the nodes

2 - nodes are ordered based on the MCS heuristic

3 - nodes are ordered based on the Min-Fill (MF) heuristic
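
For instance, a call that imposes a decomposition width of 5 and uses the MCS ordering (the concrete values are only illustrative) would be:

htdecomp -f myFile.txt -d SAND 5 2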

 

2.     The algorithms which require a construction-scheme

Some decomposition algorithms require an additional parameter, indicated as construction-scheme. The construction-scheme determines how the heuristics are to be used.

Use:    -d myAlgorithm construction-scheme

There are four different construction-schemes:

 

1)    NORMAL

 

This construction-scheme builds a hypertree decomposition of the given hypergraph by using the given hypergraph decomposition algorithm.

Example:

htdecomp -d myAlgorithm NORMAL

 

 

2)    LOCALOPT

 

This construction-scheme builds a hypertree decomposition of the given hypergraph by using the given hypergraph decomposition algorithm, where in each step the hypergraph is partitioned using all parameters (these parameters are given internally in the program) one after the other, and the best partitioning is chosen afterwards.

Example:

htdecomp -d myAlgorithm LOCALOPT

 

 

3)    BACKTRACKING

 

This construction-scheme requires an additional integer parameter which represents the imposed decomposition width. A hypertree decomposition of the given hypergraph is built by using the given hypergraph decomposition algorithm, where for each parameter (these parameters are given internally in the program) it is tested whether a decomposition of the given width is possible. If not, the program gives a message like “No decomposition possible with imposed restrictions (…)”.

Example:

htdecomp -d myAlgorithm BACKTRACKING width
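
For instance, combining the FM_PURE algorithm (see below) with an imposed width of 3 (the concrete width is only illustrative):

htdecomp -f myFile.txt -d FM_PURE BACKTRACKING 3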

 

 

 

4)    EARLYEXIT

 

This construction-scheme builds a hypertree decomposition of the given hypergraph by using the given hypergraph decomposition algorithm. The hypergraph is partitioned using each parameter (these parameters are given internally in the program) separately, and the best partitioning found so far is stored temporarily. If, while decomposing with some parameter, the partitioning already exceeds the best stored one, the program stops decomposing with that parameter and continues with the next one. The partitioning that remains stored at the end is the best one.

Example:

htdecomp -d myAlgorithm EARLYEXIT

 

 

The algorithms which require a construction-scheme are the following:

 

 

•         FM_PURE: Fiduccia-Mattheyses Pure Algorithm

This decomposition algorithm applies an algorithm proposed by Fiduccia and Mattheyses, an iterative refinement heuristic which produces an arbitrary partitioning of the hypergraph into two parts (our implementation supports only partitioning into two parts). In order to obtain an optimized partitioning, the algorithm successively moves vertices to the opposite partition. To ensure the ‘connectedness’ condition of the hypertree decomposition, special hyperedges are introduced after each decomposition step.

A characteristic of this decomposition algorithm is that it does not differentiate between ‘normal’ and ‘special’ hyperedges when evaluating the cost of a cut that contains such special hyperedges (the weight of each hyperedge is set to one). For more information about the Fiduccia-Mattheyses Pure Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d FM_PURE NORMAL

 

 

 

•         FM_PURE_WEIGHTS: Fiduccia-Mattheyses Pure Weights Algorithm

This decomposition algorithm applies an algorithm proposed by Fiduccia and Mattheyses, an iterative refinement heuristic which produces an arbitrary partitioning of the hypergraph into two parts (our implementation supports only partitioning into two parts). In order to obtain an optimized partitioning, the algorithm successively moves vertices to the opposite partition. To ensure the ‘connectedness’ condition of the hypertree decomposition, special hyperedges are introduced after each decomposition step.

A characteristic of this decomposition algorithm is that it differentiates between ‘normal’ and ‘special’ hyperedges: the special hyperedges have associated weights, in contrast to Fiduccia-Mattheyses Pure, where the weight of each hyperedge is set to one. For more information about the Fiduccia-Mattheyses Pure Weights Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d FM_PURE_WEIGHTS NORMAL

 

 

 

•         FM_NAIVE: Fiduccia-Mattheyses Naive Algorithm

This decomposition algorithm is a variant implementation of the algorithm proposed by Fiduccia and Mattheyses, an iterative refinement heuristic which produces an arbitrary partitioning of the hypergraph into two parts (our implementation supports only partitioning into two parts). For more information about the Fiduccia-Mattheyses Naive Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d FM_NAIVE NORMAL

 

 

•         FM_NAIVE_WEIGHTS: Fiduccia-Mattheyses Naive Weights Algorithm

This decomposition algorithm is a variant implementation of the algorithm proposed by Fiduccia and Mattheyses, an iterative refinement heuristic which produces an arbitrary partitioning of the hypergraph into two parts (our implementation supports only partitioning into two parts). For more information about the Fiduccia-Mattheyses Naive Weights Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d FM_NAIVE_WEIGHTS NORMAL

 

 

•         HMETIS: hMETIS Algorithm

This decomposition algorithm is based on the hMETIS algorithm, which according to the literature is one of the best available packages for hypergraph partitioning. For more information about the hMETIS Algorithm please see the project publications.

Example:

htdecomp -f myFile.txt -d HMETIS NORMAL

 

 

 

-o outputfile                         The output of htdecomp contains the hypertree decomposition computed by the given algorithm(s) and is in GML (Graph Modeling Language) format, so the hypertree can be visualized. To do so, please download Java 2 and the VGJ tool (see System requirements) and execute vis [outputfile] to start VGJ and visualize the hypertree decomposition.

                                               If no outputfile is given, a default output file named after the inputfile, in the form inputfile.gml, is created.

Example:

                          vis myOutputFile.gml
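
As a complete illustration (the file names are only examples), one could first decompose and then visualize:

htdecomp -f myFile.txt -d BE -o myOutputFile.gml

vis myOutputFile.gml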

 

 

-def                      If the variables and atoms are already pre-defined in the inputfile (see -f inputfile), the -def command option checks the consistency between the pre-defined variables and atoms and the variables and atoms which are read. If they are not consistent, the program gives an error message and aborts with exit(EXIT_FAILURE); likewise, if one or more atoms or variables are not pre-defined, the program gives an error message and aborts with exit(EXIT_FAILURE).

Example:

        htdecomp -def -f myFile.txt

         

-stats                         The program writes detailed statistics about the decomposition, such as:

•         the name of the used algorithm and heuristic (e.g., BE[MCS]);

•         the times at which the decomposition started and finished, respectively;

•         the hypergraph which was decomposed;

•         the hypertree decomposition of the hypergraph;

•         an optional comment written to the statistics file with special information that is not uniform across the different decomposition algorithms.

Example:

htdecomp -stats -f myFile.txt

 

 

 

 

2006 ©  Database and Artificial Intelligence Group, Vienna University of Technology