thinBasic: Basic Programming Language - Blogs - Petr Schreiber
https://www.thinbasic.com/community/blog.php?565-Petr-Schreiber

Testing ThinBASIC: Starting at the end
https://www.thinbasic.com/community/entry.php?217-Testing-ThinBASIC-Starting-at-the-end
Sat, 21 Mar 2020 17:13:00 GMT

In the previous blog post I presented areas in which the thinBASIC ecosystem could be improved in order to harden the stability of both the core and the modules, including those developed by you using the ThinBASIC SDK.

Today, I would like to share my views on how looking at the end first might give us an idea of what we need to account for before we write the first line of test-supporting functionality.

What kind of information do we expect from tests?

We need to raise our confidence that thinBASIC works as expected. ThinBASIC can be seen as a mix of core language functionality and supporting functions.

Just by taking a quick peek at the thinBasic help file, I can see over 2500 functions in the default thinBASIC installation.

The functions are organized in modules and each module has some further logical structure.

To give an example, even the Core module, which provides the most elemental functionality, can be further divided into string functions, flow control and many other groups.

This hierarchical structure calls for an analogous hierarchy in the tests.

During the preparation of a new thinBASIC iteration, we will need both new and old functionality tested.

Even if we took just the functions into account, we don't want to be exposed to the results of 2500 functions and all of their tests at one time. The total test count will easily be in the tens of thousands!

We need structured information, not a flat list of 2500 test results to review.

What kind of test structure can help support it?

The natural idea in the context of the thinBASIC project is to structure the tests per module. I would also suggest, to our future selves, tightly binding the module code with the module tests.

This model is already applied in the case of the stringBuilder module repository, for example.

This approach has multiple advantages; each time you make a change to the code:
- you can run the attached tests to check you did not break anything
- you can prepare new tests in sync with the change, in one place
- you can make passing tests a requirement for integrating the code change

The last one is super useful if you manage to set up your process so that the tests run before the change is integrated, and especially so if multiple people collaborate on a single module.

Running tests before change integration will be possible for thinBASIC because, for example, GitHub Actions can now spin up a virtual Windows agent to run user-defined tests, no matter which framework.
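As a sketch of what that could look like - the workflow below is hypothetical, and the test runner path in particular is made up:

```yaml
# Hypothetical GitHub Actions workflow: run module tests on a Windows agent
# for every pull request, regardless of the test framework used.
name: module-tests
on: [pull_request]

jobs:
  test:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v2
      # The runner script below is illustrative, not an existing thinBASIC tool
      - name: Run module tests
        run: .\tests\run_all_tests.cmd
```

A failing run would then block the pull request from being merged, which is exactly the guarantee described above.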

As you can see, module-level structuring is a direction worth pursuing, and also something all the modules in thinBASIC can have in common.

Do we need to divide tests beyond the module level?

Simpler modules, such as StringBuilder, have a single purpose and might need no further division.

On the other hand, we have larger modules, such as Core or TBGL, which contain more logical groups.

The number of logical groups and their nesting differs from module to module.

Having finer control over test groups could allow us to review results more easily and also to run just specific parts of the tests when needed - with a great speed benefit, useful for example during development.

Conclusion

Looking at the current module situation, having the tests hierarchically structured seems reasonable for the following reasons:
- the ability to break a huge number of tests into logical groups
- the ability to run/evaluate only a particular group of tests during development
- the ability to quickly locate the origin of a failed test once we run the complete test set

An example of such test division, for the Core module, could be:
Core -> String handling -> Mid$

...and then multiple tests in this nested group, one for each possible case of Mid$ usage.
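One possible way to express such a hierarchy - purely a sketch, the naming convention here is hypothetical - is to encode the group path into the test function names, so that results can later be grouped at each level:

```
' Hypothetical naming scheme encoding Core -> String handling -> Mid$
' into the test function names; a runner could split the names on "_"
' to rebuild the hierarchy for reporting.
Function test_Core_StringHandling_Mid_ReturnsRequestedPortion()
  ' e.g. check that Mid$("thinBasic", 5, 5) returns "Basic"
End Function

Function test_Core_StringHandling_Mid_StartBeyondLength()
  ' e.g. check Mid$ behavior with a start position past the end of the string
End Function
```

The flat function names keep the test definitions simple, while the reporting side can still present them as a tree.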

The task for one of the next blog posts will be to think about how we can aid structuring at both the test definition and the test result level.
Petr Schreiber https://www.thinbasic.com/community/entry.php?217-Testing-ThinBASIC-Starting-at-the-end
Testing ThinBASIC: Past, current and future approaches
https://www.thinbasic.com/community/entry.php?215-Testing-ThinBASIC-Past-current-and-future-approaches
Thu, 27 Feb 2020 21:46:53 GMT

The thinBASIC project will celebrate 15 years since its first public release this August. It is a real joy to see how much it has evolved, how the user base has changed and what an incredible number of excellent projects have been finished with it.

While the original "thin" thinBASIC core was accompanied by a few modules, such as file manipulation or 3D graphics, the current version has over 50 specialized extensions, which make thinBASIC a much more versatile tool than the one from more than ten years ago.

And the modules are not the only change - the core language evolved from something very similar to thinBASIC's predecessor, BINT32, to a complex language with support for advanced constructs, such as overlayed variables (DIM..AT), user defined types with methods... and many other improvements that help users optimize the speed and organization of their code.

Development is fun, and while at the beginning it was relatively easy to keep track of changes and features, ensuring everything still works after so many adjustments and extensions is a very complex task, which cannot be done without tests.

Tests which verify basic functionality is consistent, tests which ensure module functions work as documented, tests which check that thinBASIC can install and execute on all targeted Windows platforms, tests which guard that performance stays at a reasonable level.

First approach
I think the first systematic attempt at "regression" testing was the contribution of user BugZapper, who created a script which is still present in SampleScripts. It consisted of multiple test functions for operators, program flow control statements and elemental string handling. Each of them reports whether the test passes or fails.

The SampleScripts proved to be a good set of samples to execute with a new version to see if it behaves as expected.

The downsides of the mentioned approaches are:
- the BugZapper script was not further extended; it covers just the most basic cases
- the test definitions had no formal structure; these were really "just" functions
- there is no file output from the script, no record of the execution besides console output to be read by a human operator
- the number of SampleScripts grew, and there is no way to run them all manually in a reasonable time - or to remember what they are supposed to do
- there is no guarantee these tests are run before each release

Besides these two approaches, we mostly relied, and still rely, on providing new functionality in so-called preview versions, which are expected to be potentially broken and are tested by community volunteers.

Second approach
Much later, as I learned more and more about software testing, I came up with the idea of a unit test framework in thinBASIC.

Yes - I hear you. Unit testing should be the responsibility of the core/module developer, right? I agree, but PowerBASIC, which was and is used for thinBASIC development, has no formal support for unit testing. It does not have a concept of an isolated code unit either. Also - as we prepared the thinBASIC SDK in a general way, C, Pascal and other languages could be, can be and were used for development. Each of them would have its own specific approach, if any, producing a different kind of output.

This is why I wanted to solve the issue at the module level, testing the already compiled DLL. This has the advantage of the same formal way of writing tests for every module developer, and it verifies successful integration with thinBASIC at the same time. After all, it is the DLL interface exposed to thinBASIC that users use and rely on.

The project was called uniTest, and is still available on GitHub:
https://github.com/petrSchreiber/uniTest

It had multiple advantages:
- formal rules for test definition (function names starting with test_, a library of dedicated assert functions)
- a test runner with automatic test discovery (no need to call all the test functions explicitly any longer)
- saving the test results in a custom XML-like format, allowing failed tests to be checked even after testing finished and the process quit
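To give an impression of the style described above, a uniTest-based test could look roughly like this. This is only a sketch: the assert function name is illustrative, the real helpers are defined in the uniTest repository.

```
' Sketch of a uniTest-style test unit.
' The runner discovers this function automatically thanks to the test_ prefix;
' assert_StringEquals stands in for one of the dedicated assert functions.
Function test_Mid_ReturnsMiddlePortion()
  Dim result As String

  result = Mid$("thinBasic", 5, 5)
  assert_StringEquals(result, "Basic")   ' outcome recorded in the XML-like result file
End Function
```

Because the runner handles discovery and reporting, the test unit itself stays focused on the behavior being verified.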

I started to write a complete thinBASIC regression suite for core functions using the system.

I also carried the system over to some modules I built for thinBASIC, such as the StringBuilder or INI modules.

You can see an example of a module test here, for StringBuilder:
https://github.com/petrSchreiber/thi...Builder.tbasic

These tests still have the following disadvantages:
- the output format is non-standard
- the user has to explicitly wire the test execution to the test definition scripts (way simpler, but still)
- there is no guarantee the tests get executed before a thinBASIC release
- everything goes down in the case of a RunTimeError - no other tests are performed in such a case

Third approach - going further
I am really excited by the projects the community brings us, and I am super happy about the new enhancements to the language by Eros. Both bring me a lot of endorphins each year. I am addicted to thinBASIC :p

I would like to push the way we maintain and improve quality even further this year, to support further creativity.

I have a few technical ideas, and a few ideas regarding when and how to widen the test base.

As for the technical ideas, I would like to build on uniTest and bring an improved Module Testing Framework with these features:
- keeping the test definition as a thinBASIC script (enables the same approach for all module developers, independent of language)
- no need for the test runner to be part of the test unit (enables pure test code units)
- standardized test output adhering to an existing industry standard - JUnit XML or some other proven container (enables the use of existing test result visualization tools)
- the ability to run the tests asynchronously, even if only when explicitly requested (enables faster execution)
- the ability to run the tests at GitHub directly, ideally using Travis CI or GitHub Actions (enables running tests with each pull request, preventing the integration of a breaking change)
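For illustration, a JUnit XML report for the hierarchy discussed in the previous post might look like this (the suite and test case names are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <testsuite name="Core.StringHandling" tests="2" failures="1">
    <testcase classname="Core.StringHandling.Mid" name="returns middle portion"/>
    <testcase classname="Core.StringHandling.Mid" name="start beyond string length">
      <failure message="expected empty string"/>
    </testcase>
  </testsuite>
</testsuites>
```

Because this container is an industry standard, existing CI result viewers can render such a file without any custom tooling on our side.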

As for when and how to widen the test base:
- I see it as absolutely necessary to have 100% core functionality coverage (it is the base of all scripts and must be reliable)
- I see it as absolutely necessary to write tests for fixed issues (in order to pin down the correct behaviour and prevent the issue from reappearing)

Keep your fingers crossed; further progress and code will be shared on the forum and on GitHub, of course.

Also - please keep in mind that no automatic test procedure can solve all issues. After all, it is us humans who use thinBASIC. This is why it remains important to share opinions on released thinBASIC versions and give us feedback on the forum :drink:


Thank you for all the support,
Petr

P.S. I do realize there are cases which are hard to test automatically, such as TBGL (3D) or TBASS (sound). I would like to focus on these once the above is in a usable state.
Petr Schreiber https://www.thinbasic.com/community/entry.php?215-Testing-ThinBASIC-Past-current-and-future-approaches
Your first GitHub fork
https://www.thinbasic.com/community/entry.php?204-Your-first-GitHub-fork
Sun, 08 Jan 2017 09:59:54 GMT

The introductory blog post (http://www.thinbasic.com/community/entry.php?203-Idea-for-handling-your-project-code-in-2017) tried to sum up the advantages of GitHub for your thinBasic project storage and publishing.
Let's leave the cold theory and embrace a practical example! This post describes the basic workflow for founding a new thinBasic project at GitHub.

Grab the fork
Multiple times across the original article, I mentioned the term fork. As it might not be obvious, let's have a look at what it means to create a fork of a GitHub repository.

Everybody can create a project on GitHub, and everybody, unless you disable it in the settings, can base their further work on it.

While basic collaboration could be set up via multiple code branches inside a single repository, it is wiser to adopt the fork approach. It helps you isolate your solution better if needed, and avoids accidents with global impact.

Forking a repository means creating a copy of the original, a copy which is still linked to the original. You may decide not to take advantage of this connection at any later point, but you will mostly benefit from it, especially at the beginning.

How?

I presented a GitHub project template for thinBasic in the introduction. It contains some initial setup. Imagine that, in the future, some further setup will be needed for thinBasic GitHub integration - for example, adding information about the repository language being thinBasic, once that is possible.

If you had a completely separate repository, you would have to do that manually. If you base your project on a fork of the GitHub project template, you can easily resync your code base via a so-called rebase. I will dedicate more explanation to this concept in further articles; for now, please understand that it is an easy way to update your project, performed via a single command.

Creating new project
Enough talk, let's get practical:


Once you confirm, you will be taken to the fork made under your account. It will share the same name as the original; here is the example for my case.

[Image: ForkedRepository.png]
You can make two observations at the moment:
  • the repository name is the same as the original
  • the description still talks about this being a template


As we will want to use it for storing our custom solution, we need to address these somehow.
For both, there is an easy way out. Let's start with the name change.

To change the project name, pick the Settings tab. There is a convenient Rename repository option. For demonstration purposes, I will upload my interactive PF 2017 there, so I will call my repository pf-2017 and confirm the change.

To change the project description, I hit the Code tab to get back to the default view. In the code listing, please click README.md.
Once the page reloads, look for the pencil icon to edit the file directly in the browser.

[Image: EditButton.png]

The view will change to source editing. The README.md is "coded" in Markdown syntax. If you prefer some WYSIWYG, you may try an online tool, such as StackEdit.io. Once you are satisfied with your project description, you are about to make your first commit!

A commit is a batch of changes which appears in the repository. It is good to think of a good commit message which describes the change.
I personally follow this convention:
  • "fix: <bug description>"
  • "feat: <feature description>"
  • "refactor: <refactoring change explained>"
  • "docs: <what was added>"


For this case, we could go with docs, as we are not touching code but its documentation. So I would scroll down, fill in "docs: readme adjustment to match the project" and hit Commit changes.
You should see the change immediately, and even when you switch back to the Code tab, the repository will now be described in its lower part with the adjusted explanation.

The original description of the template might still be left at the top, but that is not bound to the code, just a GitHub setting. You may edit it via the dedicated button.

Adding your code
Now it's time to add your own code to the repository. While this can be done completely via the online interface, I would recommend using GitHub for Windows from this point on.
Once installed, it integrates nicely with the GitHub online interface - to clone the changes we just made to your local hard drive, hit the Clone or download button, and choose Open in Desktop.

[Image: CloneOrDownload.png]

The browser will ask whether you want to open it with the GitHub application; please allow that.

You will be prompted for where the repository should be downloaded. I recommend creating something like a Repos directory on your data hard drive, and choosing that one.
A new directory, named after your repository, will then be added inside Repos. It should contain the 3 files we know, and you may also see a hidden .git folder, which is used for local tracking.

[Image: StateOnHdd.png]

Head to the directory and place some code there. In my case, it was two files. I tend to place the tbasicu in a units folder, and the main code in the root, where README.md is.

[Image: StateOnHddAfterChange.png]
Once you switch back to the GitHub for Windows application, you can do 3 simple steps.

[Image: GitHubForWindowsAfterChange.png]



  • Click the Changes button, indicated by blue arrow
  • Confirm visually there are files you added, indicated by green arrow
  • Fill in your commit message, indicated by orange arrow


The commit message has the same purpose as the one we filled in while editing README.md - it gives information about the change you made.

Once this is done, please click Commit to master. Doing this will update the main, master branch of your code. When you hit the History button at the top of the GitHub for Windows GUI, you will see two commits: the original one, where we edited the README.md, and the one you just made.

If you click the first commit, you will notice you can easily review the changes. Removed lines are in red and added ones in green.

[Image: GitHubForWindowsChangesInCode.png]

To make the changes visible online, just hit the Sync button in the top right corner of the GUI. You can verify success by going to your browser and refreshing your repository page.

Summary
You learned how to create your new thinBasic project by forking the GitHub project template, adjusting the name and the README.md information and, finally, adding some of your own code.

With this knowledge, you can start versioning your code at GitHub. Each new change can simply be committed to master and synced, giving you an easy review of the changes over time.

But there is more to GitHub, stay tuned!
Petr Schreiber https://www.thinbasic.com/community/entry.php?204-Your-first-GitHub-fork
Idea for handling your project code in 2017
https://www.thinbasic.com/community/entry.php?203-Idea-for-handling-your-project-code-in-2017
Sun, 01 Jan 2017 15:10:50 GMT

Your code. Your creation. Your history of thrill, discovery, ingenious solutions and proud publishing.

Let me remind you of how we shared the code in the past, and what could be the possible direction for 2017.

For years, we got used to a simple system. When we developed a piece of code worth sharing or showcasing, we pasted it into a forum post, or added it as an attachment.
This approach worked, and over the years we learned some rules - to stick the latest version to the first post of the thread, along with instructions on how to use it.

This approach works, but can be fragile at times. Why? What are the risks?

Problems of the current approach
Sometimes you would like to take the published version and revert some changes - but you can't, because you erased the original on the hard drive.
And even if you found the backups, there is a ton of them and you don't know the differences between the versions.
Sometimes you would like to step up and create a project with multiple collaborators. A fixed ZIP file is not the most reliable choice.
Sometimes you create a project which could use extensive documentation, laid out along with the code.

I believe all of these issues can be solved by publishing the code into some form of versioning system.

There are many solutions, but I found one especially useful in the last two years - GitHub.

Enter the GitHub
GitHub is one of the services built around Git. Not helping? Git is a low-level tool for version management; it has a very powerful command-line interface and well-crafted documentation available.

GitHub makes working with Git simpler by providing both free online hosting for public code and a Windows GUI client, which hides the complexity from you.

The GitHub client stores every change you make to your code - you just snapshot the code by so-called committing from time to time, and you can both view the changes and revert to them, and back, at any later time.

GitHub also makes it easy to collaborate - your helpers can create branches, where they can experiment with the original code base.
Then they can propose a change to be integrated into the main branch, usually called master, by making a so-called pull request.
This means you stay in control of the development and integrate only the changes you wish to happen.

Last but not least - you can accompany the code with a Wiki page, which is one of the many extra features the service offers for free.

The whole code is organised in a so-called repository, which can be further cloned and adjusted by your collaborators - if you allow them to.

Practical example
I have used GitHub for many personal and professional projects over the years. You can have a look, for example, at Log007, a very simplistic logger for thinBasic scripts.

You can see the project consists of a few files:
  • README.md, which is the GitHub-standard way to give users information about the project
  • .gitattributes, which is a Git configuration file, telling Git that we want to preserve Windows line endings in the project
  • log007.tbasicu, which is the actual code unit, shared with the community
  • ...and the unitTests directory, which contains some tests verifying the functionality


You can review the changes I made over time by looking at the commits.
You can also review the documentation on the attached Wiki, and download the various Releases - for those who don't want to use Git/GitHub to obtain the code.

To stay connected with our community here, I created a related forum post, and I encourage you to do so for your projects as well.

Getting started
If I have got you interested, you may consider starting your first thinBasic GitHub project. I created a template for you, in our thinBasic repository.

All you need to do now is:


By doing so, you have taken the first bold step towards becoming a modern thinBasic coder, with your code under complete control.
Should you have any questions, just let me know in the comments.
Petr Schreiber https://www.thinbasic.com/community/entry.php?203-Idea-for-handling-your-project-code-in-2017
UDT inheritance: Accessing base type functions
https://www.thinbasic.com/community/entry.php?199-UDT-inheritance-Accessing-base-type-functions
Sat, 13 Jun 2015 09:26:07 GMT

ThinBASIC user defined types, also known as UDTs, offer a convenient way to inherit properties and functions - via the Extends keyword.
This approach allows inheriting properties and functions from the base UDT. This is very useful, but there is one catch one needs to be aware of!

If you specify a function in the new UDT which has the same name as a function in the base UDT, an override is performed, and whenever you use the function, the one from the new type is used.
This is expected behavior, but there are situations when you still need to access the original base function. Let me illustrate this with the following example:
Type Point2D
  x As Single
  y As Single

  Function Initialize(x As Single, y As Single)
    Me.x = x
    Me.y = y
  End Function

  Function ToString() As String
    Return StrFormat$("{1},{2}", Me.x, Me.y)
  End Function
End Type

Type Point3D Extends Point2D
  z As Single

  Function Initialize(x As Single, y As Single, z As Single)
    Me.x = x
    Me.y = y
    Me.z = z
  End Function

  Function ToString() As String
    Return StrFormat$("{1},{2},{3}", Me.x, Me.y, Me.z)
  End Function
End Type
You can see that Point2D and Point3D differ basically just by the added z property in the 3D version. What if you wanted to recycle the Point2D Initialize and ToString functions, to simply build upon them in the new UDT?
If you try to call any of these, the variant for Point3D will be used, thanks to the override mechanism described earlier.

So how do I access the base methods? Can it be done at all?
What helps resolve this situation is understanding that Point3D is basically Point2D + something new - even at the binary level. If you are thinking about an AT overlay now... yes, that is the answer :)
Using DIM .. AT with the base type allows us to use the base functions, even if they collide at the name level with the new type's names. And without any overhead, because overlay variables do not allocate memory.

Here comes the modified example:
Type Point2D
  x As Single
  y As Single

  Function Initialize(x As Single, y As Single) 
    Me.x = x
    Me.y = y
  End Function

  Function ToString() As String
    Return StrFormat$("{1},{2}", Me.x, Me.y)
  End Function
End Type

Type Point3D Extends Point2D
  z As Single

  Function Initialize(x As Single, y As Single, z As Single) 
    Dim base As Point2D At VarPtr(Me) : base.Initialize(x, y)  
    Me.z = z
  End Function

  Function ToString() As String
    Dim base As Point2D At VarPtr(Me)  
    Return StrFormat$("{1},{2}", base.ToString(), Me.z)
  End Function
End Type
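To see the overlay trick in action, here is a short usage sketch (assuming the two types above are defined in the same script, with the Console module used for output):

```
' Usage sketch for the types above; output via the Console module
Uses "Console"

Dim p As Point3D

p.Initialize(1, 2, 3)   ' Point3D's Initialize, which reuses Point2D's via the overlay
PrintL p.ToString()     ' the overridden ToString, built on top of Point2D's
WaitKey
```

The caller never sees the overlay; it stays an implementation detail inside the Point3D functions.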

Please note that while we did not save lines of code in this particular example, the benefit of reusing the base methods increases when solving real-world problems with ThinBASIC.
Petr Schreiber https://www.thinbasic.com/community/entry.php?199-UDT-inheritance-Accessing-base-type-functions