For 60 years the International Federation for Information
Processing (IFIP) has been advancing research in Information and
Communication Technology (ICT). This book looks into both past
experiences and future perspectives using the core of IFIP's
competence, its Technical Committees (TCs) and Working Groups
(WGs). Soon after IFIP was founded, it established TCs and related
WGs to foster the exchange and development of the scientific and
technical aspects of information processing. IFIP TCs are as
diverse as the different aspects of information processing, but
they share the following aims:
- To establish and maintain liaison with national and international
  organizations with allied interests and to foster cooperative
  action, collaborative research, and information exchange.
- To identify subjects and priorities for research, to stimulate
  theoretical work on fundamental issues, and to foster fundamental
  research which will underpin future development.
- To provide a forum for professionals with a view to promoting the
  study, collection, exchange, and dissemination of ideas,
  information, and research findings, and thereby to promote the
  state of the art.
- To seek and use the most effective ways of disseminating
  information about IFIP's work, including the organization of
  conferences, workshops, and symposia and the timely production of
  relevant publications.
- To have special regard for the needs of developing countries and
  to seek practicable ways of working with them.
- To encourage communication and to promote interaction between
  users, practitioners, and researchers.
- To foster interdisciplinary work and, in particular, to
  collaborate with other Technical Committees and Working Groups.
The 17
contributions in this book describe the scientific, technical, and
further work in TCs and WGs and in many cases also assess the
future consequences of the work's results. These contributions
explore the developments of IFIP and the ICT profession now and
over the next 60 years. The contributions are arranged per TC and
conclude with the chapter on the IFIP code of ethics and conduct.
This open access book presents the outcomes of the "Design for
Future - Managed Software Evolution" priority program 1593, which
was launched by the German Research Foundation ("Deutsche
Forschungsgemeinschaft (DFG)") to develop new approaches to
software engineering with a specific focus on long-lived software
systems. The different lifecycles of software and hardware
platforms lead to interoperability problems in such systems.
Instead of separating the development, adaptation and evolution of
software and its platforms, as well as aspects like operation,
monitoring and maintenance, they should all be integrated into one
overarching process. Accordingly, the book is split into three
major parts, the first of which includes an introduction to the
nature of software evolution, followed by an overview of the
specific challenges and a general introduction to the case studies
used in the project. The second part of the book consists of the
main chapters on knowledge-carrying software, covering tacit
knowledge in software evolution, continuous design decision
support, model-based round-trip engineering for software product
lines, performance analysis strategies, maintaining security in
software evolution, learning from evolution for evolution, and
formal verification of evolutionary changes. In turn, the last part
of the book presents key findings and spin-offs. The individual
chapters there describe various case studies, along with their
benefits, deliverables and the respective lessons learned. An
overview of future research topics rounds out the coverage. The
book was mainly written for scientific researchers and advanced
professionals with an academic background. They will benefit from
its comprehensive treatment of various topics related to problems
that are now gaining in importance, given the higher costs for
maintenance and evolution in comparison to the initial development,
and the fact that today, most software is not developed from
scratch, but as part of a continuum of former and future releases.
This book constitutes the proceedings of the 25th International
Working Conference on Requirements Engineering - Foundation for
Software Quality, REFSQ 2019, held in Essen, Germany, in March
2019. The 13 full papers and 9 short papers in this volume were
carefully reviewed and selected from 66 submissions. The papers
were organized in topical sections named: Automated Analysis;
Making Sense of Requirements; Tracelink Quality; Requirements
Management (Research Previews); From Vision to Specification;
Automated Analysis (Research Previews); Requirements Monitoring;
Open Source; Managing Requirements Knowledge at a Large Scale; In
Situ/Walkthroughs (Research Previews).
The first course in software engineering is the most critical.
Education must start from an understanding of the heart of software
development, from familiar ground that is common to all software
development endeavors. This book is an in-depth introduction to
software engineering that uses a systematic, universal kernel to
teach the essential elements of all software engineering methods.
This kernel, Essence, is a vocabulary for defining methods and
practices. Essence was envisioned and originally created by Ivar
Jacobson and his colleagues, developed by Software Engineering
Method and Theory (SEMAT) and approved by the Object Management
Group (OMG) as a standard in 2014. Essence is a
practice-independent framework for thinking and reasoning about the
practices we have and the practices we need. Essence establishes a
shared and standard understanding of what is at the heart of
software development. Essence is agnostic to any particular method,
lifecycle independent, programming language independent, concise,
scalable, extensible, and formally specified. Essence frees the
practices from their method prisons. The first part of the book
describes Essence, the essential elements to work with, the
essential things to do and the essential competencies you need when
developing software. The other three parts describe increasingly
advanced use cases of Essence. Using real but manageable examples,
it covers the fundamentals of Essence and the innovative use of
serious games to support software engineering. It also explains how
current practices such as user stories, use cases, Scrum, and
micro-services can be described using Essence, and illustrates how
their activities can be represented using the Essence notions of
cards and checklists. The fourth part of the book offers a vision
of how Essence can be scaled to support large, complex systems
engineering. Essence is supported by an ecosystem developed and
maintained by a community of experienced people worldwide. From
this ecosystem, professors and students can select what they need
and create their own way of working, thus learning how to create
ONE way of working that matches the particular situation and needs.