Interoperable interactive geometry for Europe
This platform is brought to you by the Intergeo project, funded under the eContentplus programme of the European Commission and by its partners.

General Guidelines (summary of the recommendations by the reviewers in the technical review)

The consortium should develop an active risk management plan to deal with potential deviation from the work plan that may affect the eventual achievement of the project objectives.

Although quality assessment is based on post-evaluation, it is recommended that the consortium implement a monitoring process carried out by experts to ensure that contents aggregated meet a minimum quality standard.

The consortium should proactively take all necessary actions to make sure end-users will be properly involved in and contribute to final outcomes of the project.

Future internal evaluation reports should clarify how the measurable performance indicators reflect the actual impact of the project.

The original risk analysis and risk management focused on three main aspects: (1) unavailability of some content and IPR clearance; (2) governments reluctant to use outcomes from INTERGEO; and (3) adoption of content by teachers. From the documentation reviewed it was not possible to identify the actual measures to be taken should any of these risks materialise. In addition, new issues that emerged during the course of the project make it advisable to define an active plan that is continuously adapted to address them. A section on risk analysis should be included in future reports.


Performance Indicators (high-level indicators)

  • reference to...
    • DoW chapter 5.2 & 5.3
    • D8.1
    • D4.7

current numbers (Y2) vs. estimated numbers (Y2)

  • content aggregated
    • 3rd-party content (D5.4)
  • increase in access
  • increase in reuse
  • QA resources
  • registered web site users
    • external users (D5.4)
  • curriculum mappings
  • school coverage

discussion of deviations for each indicator

  • reasoning and justification of the deviation
    • D6.2: reasons for the deviation concerning quality evaluation
The main reason for the deviation from the performance indicators is the aftermath of the delays experienced on the platform. As described in D6.2 (p. 13), these delays had a devastating effect on the motivation of candidate teachers: contacted too early, most of them found their first experience on the platform to be a cold shower, which impeded its adoption in the classroom. We also overestimated the feasibility of maintaining a high level of involvement with very few face-to-face meetings and little individual steering by tutors. The most successful groups of reviewers actually met face to face on a regular basis.

The tutor role described in D6.2, which was meant to be voluntary, was not taken up by any author. This role should have been actively promoted by supported actors, such as hired personnel who would animate the discussion forums.

Another issue was the difficulty of finding suitable people and of motivating them: offering contracts and payment actually proved counterproductive on several occasions, as it was seen as eroding the symbolic capital of "pure" knowledge workers and as introducing ranking and competition among teachers.

  • adjustment, or rather relativisation, of the performance indicators
The performance indicators were indeed set too high: more than 400 reviews at Year 2 and more than 600 at project completion. Only 270 reviews had been performed by November 2009, instead of the expected 400. However, the system is now known to be usable and useful, and the number of new reviews is increasing steadily, so we may still reach the target at project completion.

Moreover, we learnt from previous rounds of classroom experimentation that a smaller number of deeper reviews is much more valuable than numerous shallow "fan reviews", which provide little information. The process will simply take more time than originally planned.

    • Tutor role
In order to foster deep analysis of resources, we intend to promote more explicitly the role of resource tutor: people who reach out to and contact the users who have downloaded material, in order to bootstrap forum discussions and community building.

    • Monitoring process
The Intergeo repository can host seeds of resources, half-finished activities awaiting improvement, but these resources, although welcome, should not get in the way of the average user who wants ready-to-use material. This cannot be guaranteed until every resource has been quality-reviewed at least once. We therefore plan to set up a monitoring board that will assign an a priori quality flag to resources, in order to distinguish their different levels of completion.

      • the promised performance indicators were unrealistic by far
      • the DoW mentions 90% school coverage attained in Year 3, but it refers (in the Project Summary) to "all schools" in the EU (the Project Summary speaks of half a million science or mathematics teachers, 90 million pupils in the 25 countries, etc.). It is evident that such coverage cannot be achieved within 3 years... perhaps if we were given 25 years...
      • the starting point (3000 items provided by the consortium) was never met, not even remotely
      • the way of measuring the number of content items was not clear at the beginning: large collections could be counted as 1 item or as 100 items (D1.9 - ProgRepM24)
      • the increases in access, reuse, users, evaluations, etc. did not take into account that a stable, well-performing platform is needed before inviting external visitors.
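The counting ambiguity for content items can be made concrete with a small sketch. The record layout and numbers below are purely illustrative, not the actual i2geo data model; the point is only that the two conventions (collection as one item vs. counting its members) yield very different totals.

```python
# Illustrative repository records; fields and values are assumptions,
# not the real i2geo schema.
repository = [
    {"type": "resource", "title": "Thales circle"},
    {"type": "collection", "title": "Triangle activities", "members": 100},
    {"type": "resource", "title": "Angle bisector"},
]

def count_items(repo, expand_collections):
    """Count content items under one of the two counting conventions."""
    total = 0
    for item in repo:
        if item["type"] == "collection" and expand_collections:
            total += item["members"]   # count each contained resource
        else:
            total += 1                 # count the collection as one item
    return total

print(count_items(repository, expand_collections=False))  # 3
print(count_items(repository, expand_collections=True))   # 102
```

A shared, documented convention (as called for in D1.9) is what makes the indicator comparable across reporting periods.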

remedial actions

  • D5.4: report on the implementation & performance of these actions ==> noticeable impact?
  • unavailability of content: TOMAS: all I can suggest is to identify some topics (for instance, by going over the links in the curriculum) which have no content (or only a few items), and then launch a call to teachers asking them to provide such content. This could be done through the WP5 team.
  • implementation of a monitoring process carried out by experts to ensure that contents aggregated meet a minimum quality standard
    • TOMAS: Maybe we could ask WP6 to organize a quick monthly survey of the items uploaded during the previous month. Unfortunately, it will not be much work...
  • proactive plans to make sure end-users will be properly involved in and contribute to final outcomes of the project
    • TOMAS: let me suggest (if I understand the meaning of this point correctly; I interpret it as referring to the long run, after the project is over) channelling this through teachers' associations (or DGS user groups...)

i2g platform analysis - content & usage indicators (D4.7)

content indicators

  • total number of resources
    • number of external-links
    • simple wiki resources (lesson plan, ...)
    • collections
  • number of uploaded files, with a breakdown per system
  • users
  • groups
  • users with blog
  • blog-entries
  • messages in groups
  • number of reviews
    • with a breakdown by overall result
  • number of external links to i2geo
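Most of these content indicators are simple tallies over the resource records. A minimal sketch of how they could be computed, under an assumed record layout (field names and values are illustrative, not the actual i2geo schema):

```python
from collections import Counter

# Illustrative resource records; "kind" and "system" are assumed fields.
resources = [
    {"kind": "uploaded-file", "system": "GeoGebra"},
    {"kind": "uploaded-file", "system": "Cinderella"},
    {"kind": "external-link"},
    {"kind": "wiki", "subtype": "lesson plan"},
    {"kind": "collection"},
]

def content_indicators(records):
    """Compute the total number of resources, the breakdown by kind,
    and the per-system detail for uploaded files."""
    by_kind = Counter(r["kind"] for r in records)
    per_system = Counter(r["system"] for r in records
                         if r["kind"] == "uploaded-file")
    return {"total": len(records),
            "by_kind": dict(by_kind),
            "uploads_per_system": dict(per_system)}

stats = content_indicators(resources)
print(stats["total"])               # 5
print(stats["uploads_per_system"])  # {'GeoGebra': 1, 'Cinderella': 1}
```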

usage indicators

  • map of Europe including the location and number of visits (daily updated)
  • number of resources played in browser
  • number of resource files downloaded
  • number of external links followed
  • number of search queries
  • number of saves of reviews (creation, edition)
  • number of saves (creation, edition) of resources
  • number of deletions of resources
  • number of resource branches (the copy action that lets a user appropriate a resource)
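The usage indicators above amount to counting events in the platform's access log, including the per-country visit counts that would feed the daily-updated map of Europe. A hedged sketch over a hypothetical event log (action names and fields are illustrative; the real platform may log different identifiers):

```python
from collections import Counter
from datetime import date

# Hypothetical log entries; the field names are assumptions.
events = [
    {"action": "play-in-browser", "country": "FR", "day": date(2009, 11, 2)},
    {"action": "download",        "country": "ES", "day": date(2009, 11, 2)},
    {"action": "search",          "country": "DE", "day": date(2009, 11, 3)},
    {"action": "save-review",     "country": "FR", "day": date(2009, 11, 3)},
    {"action": "download",        "country": "FR", "day": date(2009, 11, 3)},
]

def usage_indicators(log):
    """Tally each usage indicator, plus per-country visit counts
    for the map of Europe."""
    by_action = Counter(e["action"] for e in log)
    visits_by_country = Counter(e["country"] for e in log)
    return by_action, visits_by_country

actions, visits = usage_indicators(events)
print(actions["download"])   # 2
print(visits["FR"])          # 3
```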

additional impact indicators

DGS usage indicator

  • DGS in national curricula (reference to forthcoming deliverable D5.5)
  • increase in the number of DGS workshops? (annual national congress of the association of math teachers) ==> COLETTE?

Conclusions & Consequences


  • the project design ran in parallel four actions that usually require distinct and sequential time frames:
    • building up the framework for a repository – filling up the repository – disseminating and using the repository – showing the influence of the repository
    • we are at phase 1.5, and one cannot expect to measure the success of the project by the outcomes of phases 2, 3 and 4... yet
    • the project has successfully achieved the building up of the repository framework (including, for instance, the quality framework, the GeoSkills ontology and the curriculum mapping). The number of LUMs and of conferences, training sessions, publications, etc. related to Intergeo is larger than initially promised.
    • my local perception: in Cantabria (and, I would say, in most of Spain) a new curriculum has been launched (since school year 2007-08 for Primary and Secondary School, and 2008-09 for Upper Secondary School) with explicit references to dynamic geometry at each of the three levels: Primary, Secondary and Upper Secondary. This school year, 2009-10, there are 1000 (one thousand) teachers from all over Spain following online courses on GeoGebra launched by the Ministerio de Educacion. Two GeoGebra teachers' associations were founded in 2009, and three more are expected in 2010... These are facts without precedent in the history of our school system. Most of the people involved in this trend know about Intergeo... and some are directly involved in the project.
    • Maybe some similar stories, well documented, could be an alternative way of showing the success of Intergeo regarding, at least, its influence on the increasing use of DGS.