This reflects a change of tack for the BasilProject; previously, these notes were unpublished.

----

01.08.04 - For a MastersThesis, I am looking at rolling out more of the BasilParserDevelopment component and writing a thesis that focuses on the BasilTestStringApplication. The bad part is that I am using code I haven't worked with for up to seven years. I'm also trying to figure out the proper coupling between Basil parser integrations and Basil models. I think I am going to keep AbstractSyntaxTrees as a separate coupling, but I'm going to need to standardize the AbstractSyntaxTree format so you can build good handlers for the model. I have defined a BasilGrammarModel as a DTD and generated a factory for it. This is a good example of how to use the model framework:

% DTD2PyModel.py BasilGrammar.dtd > BasilGrammarModel.py

Next steps:

* Write adapters from the various grammar parsers that instantiate BasilGrammarModel instances.
* Write a handler that adapts the BasilGrammarModel to something the BasilTestStringApplication can choke down.
* Write a handler that creates AbstractSyntaxTree model handlers.
* Use the tool previously mentioned to write a PythonBytecodeCompiler.

Other recent developments include moving the code base to a SubVersion repository on CodeSpeak (which also hosts PyPy). I'll need to figure out how to do the proper indirection from SourceForge. I wasn't able to get my pgen module into Python 2.3, but I've put it in my code base (along with the documentation I drafted for PyCon). What is more bizarre is that I've reimplemented pgen in the PythonLanguage.

----

01.18.03 - I would hate to say the project is dead, so I'm not going to do that. About a year and a half ago, I pitched a proposal to expose the pgen parser generator to the PythonLanguage interpreter (PythonEnhancementProposal 269). After getting some interest in its completion from GuidoVanRossum, I have finally got a prototype working that I don't mind releasing.
I have submitted an abstract to PyCon on the matter and should put the patch on SourceForge in a day or two. This reflects another change of tack, as I was previously working on the BasilMetamodeling component in the entries below. Using a pgen module in PythonLanguage, I think I should be able to sprint into development of parts of the BasilParserDevelopment component. I found a UML tool for PythonLanguage (PyUt) and will need to look into sharing code. They don't seem too far along in their XMI integration, and maybe they could help complete my work. Python should only require one XMI module per UML revision.

----

10.12.00 - I am almost at the point where I think I have a valid NSUML clone in Python. The basil.modeling.uml.NSMeta2Py module has come of age. The primary thing left to do is that the element/attribute mapping component needs to be computed and added to the model factory class constructor. This also means that the names of elements and attributes need to be mapped. It should also be noted that the BaseModelFactory capitalizes the first letter of the element name when looking for a creation class in the factory, so this needs to be reflected when the factory class is generated.

----

09.26.00 - Need to bang out some of the details of the NSUML meta-model. How is the generated library serialized? The easiest answer is to consult the NSUML library itself, but I have been avoiding this because of a bias against the TrendyLanguage that was Java (I wonder what the next hot language will be; not C#, I hope). From what I have seen of NSUML 0.3.6, the meta-model is built and then processed in a similar fashion to the model elements in Basil, but it processes the model separately for each aspect of the library (import and export have their own classes). The library is built, and then the other serialization and factory classes are built in separate passes over the meta-model.
This is somewhat interesting, but there is a trade-off between speed and keeping the code generation functionality separate (and therefore somewhat easier to deal with). Frankly, I dunno if this is making any sense since it is getting late, but I think I have some insight. The factory class for the UML library will try to provide creation methods for each class in the meta-model using only the class name (package containment is worried about behind the scenes, especially for import and export reasons). In the event of name collisions, it will simply prepend the containing packages for both names, continuing this process until no more names are in conflict. The rationale for this decision is to keep the names of the creation methods from getting out of hand. Example:

package Spam:
    class Egg;
    package Tonnage:
        class Sparrow;
        class Weight;
    package Velocity:
        class Sparrow;
=================
class modelFactory (...):
    def createEgg ();
    def createTonnageSparrow ();
    def createWeight ();
    def createVelocitySparrow ();

----

08.21.00 - Reversed the order of the log; the newest log entry is now at the top of the flat Wiki page. Wanted to note that the underlying design of the model framework is now documented at:

http://wildideas.org/basil/modelframework.html

Since I am now able to process the NSUML generator format, I now need to add the modeling framework into the Python code generator. Then I would be able to do the following:

% handle.py uml/NSMetaModel uml/NSMeta2Py uml/NSUML.xml nsuml
% internalize.py nsuml

This will require that modeling/uml/NSMeta2Py.py be extended to add the IModelFactory interface and a BaseModelFactory subclass into the root __init__ module. Once this is accomplished, it will provide some rough XMI integration with NSUML. (Note that NSUML has released their model framework generator, NGEN. This needs to be looked at and patterns extracted for possible use in Basil.)
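As an aside, the collision-resolution scheme described in the 09.26.00 entry above can be sketched concretely. The following is a hypothetical illustration of that naming rule, not Basil's actual code (the function name factory_method_names is mine): names start as the bare class name, and only colliding names grow by prepending their containing packages until every method name is unique.

```python
def factory_method_names(qualified_names):
    """Map package-qualified class names to unique create* method names.

    qualified_names: e.g. ["Egg", "Tonnage.Sparrow", "Velocity.Sparrow"]
    (package paths relative to the model root, '.'-separated).
    """
    parts = {qn: qn.split(".") for qn in qualified_names}
    # How many trailing path components each name currently uses;
    # start with just the class name itself.
    depth = {qn: 1 for qn in qualified_names}
    while True:
        seen = {}
        for qn in qualified_names:
            name = "".join(parts[qn][-depth[qn]:])
            seen.setdefault(name, []).append(qn)
        collided = [qns for qns in seen.values() if len(qns) > 1]
        if not collided:
            break
        progressed = False
        for qns in collided:
            # Prepend one more containing package to each colliding name.
            for qn in qns:
                if depth[qn] < len(parts[qn]):
                    depth[qn] += 1
                    progressed = True
        if not progressed:  # identical fully-qualified names; give up
            break
    return {qn: "create" + "".join(parts[qn][-depth[qn]:])
            for qn in qualified_names}
```

Run against the Spam/Tonnage/Velocity example from the 09.26.00 entry, this yields createEgg, createTonnageSparrow, createWeight, and createVelocitySparrow, matching the modelFactory sketch there.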
More work needs to be done in mapping the NSUML meta-model to the MOF, and the MOF model onto code (Python code specifically). Another route to XMI integration is the following scenario:

% DTD2PyModel.py uml/uml1_3.dtd > uml/uml1_3Model.py
% internalize.py uml/uml1_3Model

This will require an improvement in element name handling that may have been mentioned in previous entries. The part I don't like about this integration is that the model will be dumped into a flat file without any attention paid to packages (for example, using DTD2PyModel.py, X.Y.Z would be placed into the flat model as X_Y_Z). Perhaps an XMIDTD2PyModel.py may be in order, depending on the expected payback of using the NSUML generator framework to gain access to the root OMG UML meta-model. Other solutions may include a custom XML handler that maps over to the generated NSUML UML meta-model library.

----

08.08.00 - Finally got around to making internalize.py work. Need to sync current work into CVS, but the current test files are owned by another project (NSUML, also on SourceForge). The tests that the checked-in source passes will therefore not be in CVS (I am working on this issue). For those unable to replicate my results, here is what you would do if you were me (once again, from the modeling directory of the Basil repository):

% DTD2PyModel.py uml/NSMeta.dtd > uml/NSMetaModel.py
% internalize.py uml/NSMetaModel uml/NSUML.xml
[Runs with no errors.]

Note that the interface to internalize.py is different from the one indicated in my entry for 7.13.00; the model module is specified without the ".py".
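For a rough picture of what an internalize.py-style driver might do under the hood, here is a minimal sketch: walk the XML document with SAX and dispatch each element to a matching creation method on a factory object, capitalizing the first letter of the element name (the BaseModelFactory convention noted in the 10.12.00 entry). The FactoryHandler class, the children attribute, and the internalize signature are my assumptions for illustration, not Basil's actual interfaces.

```python
# Hedged sketch of an internalize.py-style driver; NOT Basil's real code.
import io
import xml.sax

class FactoryHandler(xml.sax.ContentHandler):
    def __init__(self, factory):
        xml.sax.ContentHandler.__init__(self)
        self.factory = factory
        self.stack = []   # model elements currently being built
        self.roots = []   # top-level model elements
    def startElement(self, name, attrs):
        # "model" -> factory.createModel({...attributes...})
        method = getattr(self.factory,
                         "create" + name[:1].upper() + name[1:])
        node = method(dict(attrs))
        if self.stack:
            self.stack[-1].children.append(node)
        else:
            self.roots.append(node)
        self.stack.append(node)
    def endElement(self, name):
        self.stack.pop()

def internalize(factory, xml_source):
    """Internalize an XML document into model objects built by factory."""
    handler = FactoryHandler(factory)
    xml.sax.parse(xml_source, handler)
    return handler.roots
```

A factory whose create* lookup fails on unknown elements would make this behave as the "semi-validator" suggested in the 07.13.00 entry below.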
Dry run of the handler framework is as follows:

% DTD2PyHandler.py -b NSMeta2Py uml/NSMeta.dtd > uml/dummy.py

Now, using a more thought-out interface to the handleModel objective:

% handle.py uml/NSMetaModel uml/dummy uml/NSUML.xml
[Runs with no exceptions raised, but also with no visual feedback that the model was walked properly.]

By the end of the day, internalize.py and handle.py were added to the CVS repository, along with the NovoSoft metamodel-to-Python code generation handler. I was able to generate Python using the following command:

% handle.py uml/NSMetaModel uml/NSMeta2Py uml/NSUML.xml test
[Outputs some rough debug text, along with generating a subdirectory titled 'test'.]

----

07.28.00 - Updating the log like I said I would...still working on changes to the BaseModelFactory class and its automatically generated subclasses.

----

07.25.00 - Continuing to test internalize.py. Need to establish a consistent mapping between the lexical conventions of XML and Python. Specifically, XML allows attributes of an element to use the period character. One strategy would be as follows: map lexical violations to underscore characters, then add a class attribute that maps from the Python class attribute name back to the corresponding XML attribute name. This would allow externalize as well as internalize operations on library objects. -- This is not needed. The map could be stored in the factory class, which already keeps track of element names and attributes.

----

07.21.00 - Trying the test documented in the previous entry. Finding lots of little bugs. Currently trying to debug a problem in the DTDParser component where a node data element is created with a list instead of a dictionary for its attributes. (Occurring while using internalize.py.) Oh yeah, I wrote the internalize.py script. Currently I am not pleased with its behavior because the imp module does not split up the module search by dots (i.e.,
I can't say "uml.NSMetaModel" as above; I have to chdir into the "uml" directory and then import "NSMetaModel").

----

07.13.00 - Currently trying to build the modeling infrastructure so that the Basil/Python XMI integration may be automatically generated using metamodels taken from NSUML (http://nsuml.sourceforge.net/). The ideal unit test situation would be as follows (in the modeling subdirectory):

% DTD2PyModel.py uml/NSMeta.dtd > uml/NSMetaModel.py
% internalize.py uml/NSMetaModel.py uml/NSUML.xml
[No errors encountered.]
% DTD2PyHandler.py -b NSMeta2Py uml/NSMeta.dtd > uml/NSMeta2Py.py
[Modify uml/NSMeta2Py.py to map from the NovoSoft metamodel document type to PythonLanguage constructs via the PyExternals? module.]
% modelProcessor.py uml/NSMeta2Py.py uml/NSUML.py uml/NSUML0
[Generates a Python model package that includes the IModelFactoryModule interface at the top level, thus granting an initial XMI integration!]

The previous scenario calls for the creation of two new scripts: internalize.py and modelProcessor.py. These scripts will accept IModelFactoryModule- and IModelHandlerModule-compliant Python modules, respectively. internalize.py will internalize an XML document into the model whose factory is passed (acting as a semi-validator?). modelProcessor.py will run a model handler over the XML document.
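To make the DTD2PyModel.py scenario more concrete, here is a toy sketch of what such a DTD-to-Python-model generator could do. This is strictly illustrative: the regexes, output layout, and the name dtd_to_model_source are my assumptions, not the real script, and a real DTD parser would handle content models and multiple attributes per ATTLIST. It does, however, show the two lexical mappings discussed in this log: capitalizing class names for factory lookup (10.12.00) and flattening '.' to '_' (the X_Y_Z example from the 08.21.00 entry).

```python
# Toy sketch of a DTD2PyModel.py-style generator; NOT the real script.
import re

# Crude declaration scanners (one attribute per ATTLIST, no content models).
ELEMENT_RE = re.compile(r"<!ELEMENT\s+([\w.:-]+)")
ATTLIST_RE = re.compile(r"<!ATTLIST\s+([\w.:-]+)\s+([\w.:-]+)")

def dtd_to_model_source(dtd_text):
    """Emit Python class-stub source for each element declared in a DTD."""
    attrs = {}
    for elem, attr in ATTLIST_RE.findall(dtd_text):
        # XML names may contain '.', illegal in Python identifiers;
        # map it to '_' (see the 07.25.00 and 08.21.00 entries).
        attrs.setdefault(elem, []).append(attr.replace(".", "_"))
    lines = []
    for elem in ELEMENT_RE.findall(dtd_text):
        # Capitalize the first letter for the model/factory class name.
        cls = (elem[:1].upper() + elem[1:]).replace(".", "_")
        lines.append("class %s:" % cls)
        for a in attrs.get(elem, []):
            lines.append("    %s = None" % a)
        if not attrs.get(elem):
            lines.append("    pass")
        lines.append("")
    return "\n".join(lines)
```

Feeding this a fragment such as <!ELEMENT spam (egg*)> with an ATTLIST declaring weight.kg produces a Spam class stub with a weight_kg attribute, which is roughly the shape of transformation the internalize.py/modelProcessor.py scenario above depends on.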