Too many concepts: "library", "package", "plugin"


It will be hard for users to understand these things. Even if they do, they will probably not be happy with so many constraints and dependencies when they develop their own libraries under RESTFramework. Splitting the workload is good and widely accepted, but it should not go that far for a lightweight piece of software. A common user just wants a plain list of "components" that add functionality to REST. These "components" should be flexible to install and remove (no cross dependencies), and their naming and functionality should not cause confusion.

In the current scheme, we have the restG4 package depending on the RestG4Lib library, and RestRawLib depending on RestDetectorLib. This is not good. If a user wants to install Geant4 capability, he needs to switch on restG4, RestG4Lib, and RestDetectorLib, and he may wonder how the two "G4" items differ. If he accidentally deletes or updates libRestDetectorLib.so, the whole program will probably stop working. We should prevent this.

In my opinion, a single layer of extended libraries is enough. A library should not be constrained to be process-oriented.

RestDetectorLib had better be merged into RESTFramework. Then the other libraries are independent of each other, and people can add, update, or remove the corresponding .so files freely. It is also reasonable to promote those two event types to basic event types, since we are aiming to process TPC data.

"Packages" can be integrated into "libraries" through the TRestTask interface. The executable provided by a "package" is nothing but a main function receiving certain arguments. I think it is fine to rewrite them as TRestTask-derived classes, implementing the virtual method TRestTask::RunTask() just as the executable's main function. Then, running the executable of a REST package is equivalent to calling restManager PACKAGENAME, and all the proposed "packages" run within the framework.

In addition, we need to define interfaces for some metadata classes, so that we can add various "metadata libraries". The best example is the previous implementation of restDecay0. We turn TRestParticleCollection into a generator-wrapper interface class, and then add a class that inherits from it. The inherited class should follow the naming convention TRestParticleCollectionXXX, where XXX is the package name. Then, when the framework wants to use a specific generator (specified by the user in the rml file), it finds the package and instantiates the class according to the name. (The code of the interface class does not know the package name at all!)

In summary, the idea of this library architecture is that we define various interfaces in the main framework and change the default behavior by adding or removing different kinds of "libraries" that implement those interfaces. The advantages are, first, that we no longer have complex dependencies between "components", and second, that it is easier for users to learn REST.

I think this technique is also very common in other projects. For example, when a game developer wants to enable mods (which change character appearance, weapon damage, etc., when installed), he leaves the related methods virtual and loads the mod libraries dynamically. The program reflects on the derived classes and calls the implemented methods if they exist. (In this process the program does not know the mod names either.)

A user does not need to understand these things. A user who is not going to contribute to REST does not need to understand this.

When they develop a library in RESTFramework they do not need to think about any dependencies. Which dependencies, and which constraints? I don't understand what you mean by constraints, but in the new scheme I am not adding any additional constraints. On the other hand, in the case of a new full library they should develop their own processes, which operate on their own particular event data type. If they want to connect to another REST library, there is a dependency, but it will be transparent to them.

The CMakeLists provided in the REST dummy template library will be clever enough to automatically find the installed dependencies, so that the user does not have to bother about it.

This scheme is also important for people who will contribute processes and/or libraries in the future. They must understand the concept of a library in REST.

No, this is not true. A user might want just to read existing Geant4 files previously generated with restG4 by another user. In that case he does not need to install restG4; just using libRestGeant4 will allow him to read those files.

That's why it is important to understand the concepts. restG4 is a Geant4 code that uses the REST and ROOT libraries to generate data that REST will understand. It is a package.

I don't see how RestDetectorLib would not be available. Why should I add something to disable it? This library will come by default with RESTFramework.

That's where documentation plays a role.

It is not, if you want to produce complex data-processing chains out of simple, independent libraries.

I don't understand what you mean by process-oriented. A REST library is a set of specific objects inheriting from TRestEvent, TRestEventProcess, and TRestMetadata, plus any related helper classes. Perhaps you want to introduce another concept?

No way. The nature of RestDetectorLib is much closer to the nature of the other libraries than to the nature of the framework. The framework prototypes processes, event types, and metadata; implements the management of, processing of, and access to the data; and introduces common tools.

Why merge it with a library that defines specific processes? Again, I don't see the problem: this library (detector) will be installed by default, so I see no real change, just a better structure that makes it easier to differentiate concepts.

On top of that there are several advantages.

  • When there are new changes to the detector library, it will be obvious in the future: we simply track the changes happening individually in the detector library and in the other libraries.
  • We will be able to keep an independent versioning of the detector library, related to specific processes, metadata, and event data.
  • It will be clear when the framework is stable, when the detector library is, or when other parts are.
  • We follow KISS philosophy.
  • The processes are grouped by functionality.
    • Geant4Lib: Access and store Geant4 data, define Geant4 simulation conditions, etc.
    • RawLib: Access external data generated by a DAQ, define processes that work on signal conditioning.
    • TrackLib: Library dedicated to more sophisticated events with inheritance capabilities; hits - cluster - track concepts, used for pattern recognition and topology.
    • DetectorLib: Defines the detector properties, such as readout and gas, and serves as the central part interconnecting the different libraries.

Even if the detector library is a directory outside the framework directory (i.e. they are not merged), users are still able to add, update, or remove any .so files they wish.

I agree with that. REST was born to work with TPCs, so the detector library is a central library, and we will never remove it from REST. But I can envisage using the features of the framework and exploiting its capabilities to build libraries that do not depend on the detector library. Also, for the reasons above, I believe it is cleaner to be able to differentiate them (framework and detector).

Ok, this is a possibility, and it seems reasonable. Although it implies work to rewrite the existing packages following this philosophy, and to create a dummy package that anyone can use as an example/reference for creating a new package. Still, a package should be able to run independently, as it does now, using an executable; why not. Also, this is something new: in general, people are used to creating C++ programs, and if users have to learn to integrate a package into REST, they will not be happy; they have to learn an additional thing.

But in any case, a package and a library are different concepts and should be kept in different places. A package is an independent executable program; a library implements classes and methods and inherits from the REST framework.

This really starts to be material for another topic; it is becoming too dense. Still, take packages/restDecay0/TRestParticleCollectionDecay0 as implemented today: this is not a package, because it is a class, and because it inherits from TRestMetadata.

What you call a package is for me a plugin. The plugin can be distributed (or not) with the library, and enabled or disabled. A package, in REST literature, is a program.

Which complex dependencies? I don't see how adding more advanced features makes REST easier to learn. To learn REST, meaning to use REST, users need to see examples, study what particular processes do, and learn the common tools: accessing the analysis tree, producing systematic plots, etc. Documentation is what matters, not the implementation.

Actually, these are just my ideas for the framework architecture. Thanks for reading and replying.

So this library will be installed by default. Then I have no problem with it.

My central idea is to build those optional functionalities into "libraries" (actual C++ library files, .so files). We should not care whether they are called "plugins" or "packages". The restDecay0 project is still the same topic; it is designed under this idea.

Do you mean enabling/disabling some code with the C++ preprocessor, like what we do in TRestGas? That is not as good as the restDecay0 solution.

For example, suppose in the future we want to add a new generator wrapper for muons (with the external program "cry"). As with restDecay0, we just need to create a new project and install a new library file containing a new class TRestParticleCollectionCry. Nothing else changes.

However, if we want to integrate a new gas simulation program, we need to change the code of TRestGas. Only one gas simulation program can be integrated at a time, and the code becomes very long and hard to read.

Yes, it is just a matter of rearranging things, not about plugging or unplugging.
Personally, for project management the strongest point is being able to track the development of the detector processes and other related classes independently, and to identify a version number with this specific library. For me that is a real need.

We should not care; we may agree on another naming convention if you like. But when contributing we must somehow accept a few common definitions, so that contributions from different people come together in a coherent way.

Yes, the topic of file arrangement belongs to this discussion. I believe ParticleCollectionDecay0.cc and ParticleCollectionDecay0.hh should just live inside the RestGeant4Lib directory and compile together with it.

Do we agree that the nature of packages/restG4/ and packages/decay0/ is not the same?

Or maybe not. For example, suppose in the future we want to add a new generator wrapper for muons (with the external program "cry"). As with restDecay0, we just need to create a new project and install a new library file containing a new class TRestParticleCollectionCry. We don't want to update the whole RestGeant4Lib, do we? The update is not a Geant4 thing; it is a new generator wrapper. And if it is a user who wants to do this, he may not have permission to update RestGeant4Lib. It is easier for him to create his own wrapper library.

We can also update TRestGas in a similar way. Currently, if we want to integrate a new gas simulation program, we need to change the code of TRestGas. Only one gas simulation program can be integrated at a time, and the code becomes very long and hard to read.

Yes. Following the current naming concept, we may move packages/decay0/ elsewhere in the future.

It is not, but it is only used by restG4, which means Geant4. Is the idea to add other simulation packages that will make use of the Decay0 and Cry generators? Then it could live somewhere else, and the interface you were talking about could then be somewhere in the framework.

Connected to the previous answer: if we stick to using only Geant4, then it looks more like a Geant4 library option, and it makes sense to update/upgrade the Geant4 library adding this new feature, plugin, option, or however we want to name it.

Do you know any other simulation programs for gas properties? Magboltz is quite widely used in the gaseous detector community; I am not aware of others.

Anyway, it could be defined just as an option inside TRestGas, implementing both inside with a #define. Adding an additional interface adds complexity, and I do not expect us to have 10 different gas simulation programs.

If this is the idea, I think we would just rename the Geant4 library to something like libRestParticlePhysics.

Then RestGeant4Lib is a sub-module of REST, and RestDecay0 is a sub-sub-module of RestGeant4Lib. That feels strange to me.

I don't mind making TRestParticleCollection a framework class. It is now a generator wrapper, and a generator is not necessarily used by Geant4. It is possible that in the future we will do some toy MC using the simple particle information provided by TRestParticleCollection.

I don't know either, but we may meet the demand in the future. A possible case: someone using a liquid TPC may want to use REST to simulate their electron drift process. We already have the process TRestElectronDiffusionProcess, which queries TRestGas for its diffusion constant, a quantity that exists for liquids as well. In this case we could first change TRestGas into TRestDriftVolume and make it work as TRestParticleCollection does. Then we just need to add a library containing a class TRestDriftVolumeLiquid implementing the method GetXXXDiffusion(). People can reuse TRestElectronDiffusionProcess! They don't need to write a new process.

Another example is TRestDataBase. It should work in the same way to support multiple SQL backends (MySQL, PostgreSQL, etc.) as the user requires.

In conclusion, a good way to plug in functionality is to define interfaces for the various classes and implement them through independent libraries. Many classes need this.