Saturday, November 8, 2014

Overall Experience in the 2nd Quarter

I am so glad that I was able to learn so many things in Kumsay I. We learned about HTML, PowerPoint, netiquette, protocols, and DNS.

As a whole, I can say that we all learned to be patient and hardworking in doing our outputs for this subject. For instance, when writing HTML, we need to focus on every little detail because one wrong move can lead to another mistake. We also learned to be careful at all times. For example, we need to think before we type or click, as part of netiquette. Most of all, we learned to do our best and exert enough effort in everything we do, because you will never know what you can be if you will never know what you can do (Alvarez, 2014).

:)

This is Andreya Marie Rubio Alvarez, signing off. 
Until next time!!!

How to Present PowerPoint Presentations


A Good PowerPoint Presentation -- Tips

A PowerPoint presentation's purpose is to present data through simple yet understandable visuals. With that, we need to follow some rules to have a presentable output.

1. Use a legible font style and font size.
2. Use a proper color scheme. Consider color theory: the main color groups, the relationships between colors, and the emotions they evoke.
3. Keep it simple. "Simplicity is beauty", as the cliché goes.
4. Minimize numbers in slides. Instead, use graphics to convey your point.
5. Be brief. Don't be too wordy.

Try to follow these rules to come up with better PowerPoint presentations. :)

Sublime Text and Notepad

When making web pages, we use Sublime Text or Notepad. These two are the best!




HTML Defined and Described

  •  HTML is a language for describing web pages.
  •  HTML stands for Hyper Text Markup Language
  •  HTML is a markup language
  •  A markup language is a set of markup tags
  •  The tags describe document content 
  •  HTML documents contain HTML tags and plain text (see the small sketch below).
  •  HTML documents are also called web pages.    
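
To make this more concrete, here is a tiny sketch of an HTML document. It is only an illustration (the title and text are made-up examples, not from our actual class files), but it shows how tags wrap plain text:

<!DOCTYPE html>
<html>
  <head>
    <!-- the title shows up in the browser tab -->
    <title>My First Page</title>
  </head>
  <body>
    <!-- the tags describe what each piece of content is -->
    <h1>Hello, world!</h1>
    <p>This paragraph is plain text wrapped in tags.</p>
  </body>
</html>

The browser reads the tags and turns the plain text inside them into a formatted page.
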
For me, HTML is such a cool thing to do. Before, I thought that making web pages was as hard as the hardest rock in the world. Nah, just kidding. I am so glad that we are now able to make different graphics and content using HTML.

This quarter, we also made our own HTML pages using Sublime Text. This is my passed work. :)

I chose the topic of ballet because it has been a part of my life for many years.

(Screenshots of my page's sections: Blog, Home, and Profile.)
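
As a rough idea only (this is a hypothetical sketch, not my actual file), navigation between pages like Home, Profile, and Blog can be written with simple anchor tags:

<!-- hypothetical navigation links; the file names are made up -->
<a href="home.html">Home</a>
<a href="profile.html">Profile</a>
<a href="blog.html">Blog</a>

Each link just points to another HTML file, so clicking it loads that page.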

It may look really complicated at first, but once you get the hang of it, you can keep up with the flow or even go for more complicated pages. :D

Friday, November 7, 2014

Domain Name System

This video explains the Domain Name System. =)


How does the Internet work?

View the video to learn how.

Web Searches

When we search the web, there are some tips and tricks to get relevant results.

Here are some of them.

1. Don't be too wordy. In searching, less is more. The engine can easily look for what you want when you are not loquacious. 
2. Don't worry about cases. Search isn't case sensitive.
3. Search within a site. For example: History site:history.com
4. Search by file type. For example: The Filipino People filetype:doc
5. Get definitions. For example: define: science

These tricks will lead us to accurate and precise results.

Protocols

This quarter, we also learned about the different rules that govern internetworking.

A protocol is a set of rules that governs the communications between computers on a network. These rules include guidelines that regulate the following characteristics of a network: access method, allowed physical topologies, types of cabling, and speed of data transfer.

There are different protocols. The following are some of them.

Internet Protocol (IP) is the primary network protocol used on the Internet. On the Internet and many other networks, IP is often used together with the Transmission Control Protocol (TCP) and referred to interchangeably as TCP/IP.

Transmission Control Protocol (TCP) is a connection-oriented network protocol used by major Internet applications such as the World Wide Web, email, remote administration, and file transfer.

User Datagram Protocol (UDP) is a connectionless network protocol that allows computer applications to send messages, in this case referred to as datagrams, to other hosts on an Internet Protocol (IP) network.

Thanks to these protocols, we have an organized and properly working internet.

Friday, October 31, 2014

The 2nd Quarter

As the 2nd quarter started, I felt a little bit nervous because I knew that the lessons would become more challenging and difficult. On the brighter side, there would also be more hands-on activities that would let us learn more. I am glad that we learned so much during this quarter and that we can apply it to the things we do in our daily lives. We are now able to write HTML, make PowerPoint presentations, and apply netiquette. Indeed, Computer Science leads us down new paths. :)


Saturday, August 30, 2014

Computer Science: Learning in the Techno World


We are always saying that computers are really important. They give us so many advantages that only computers can give. It is time for us to realize what computers really are and what they have contributed to us throughout the years of their evolution.

As students, computers have helped us a LOT. Actually, they have already become a part of an average student's life. Because of them, we are able to research the different things we have to know and reflect on the things we have learned. Especially in computer science, we learn by using the very gadget that we always use. Also, with the use of Moodle, we are able to keep track of lessons, assignments, and projects, and, best of all, we no longer need the old pen-and-paper method for testing because all of our tests are posted on Moodle. It also enhances our creativity because, in every document, we need to let our creative juices flow to have a remarkable output. Computer science has also developed our patience and made us discoverers.


The modern world is full of surprises. And… who knows? Maybe the students of Philippine Science High School - Bicol Region Campus will be the next to invent another state-of-the-art gadget that can help society, just like the computer. When we learn more, we can DO more. Indeed, computer science gave us the chance to learn in the techno world.

WE LOVE COMPUTER SCIENCE!





Friday, August 29, 2014

Microsoft Word

In our class, I learned many new things about Word. Before, I thought that I already knew almost everything about Word. But we realized that we still need to master it.

Of course, before everything else, sir reviewed us on the basics of this application: changing the font size and font style, inserting equations and symbols, and many more.

After that, we learned how to insert headers, footers, and page numbers. They are all found under the "Insert" tab. Then, we learned how to create a table of contents, a list of figures, and a bibliography by managing citations and entries. After a few days, we learned how to start a mail merge and how to send a message to different recipients.


Computer Science, IT and ICT

Computer Science is the study of computers, software, and hardware. Computer science is the scientific and practical approach to computation and its applications. It is the systematic study of the feasibility, structure, expression, and mechanization of the methodical procedures (or algorithms) that underlie the acquisition, representation, processing, storage, communication of, and access to information, whether such information is encoded as bits in a computer memory or transcribed in genes and protein structures in a biological cell. A computer scientist specializes in the theory of computation and the design of computational systems. Source



Information Technology (IT) is the application of computers and telecommunications equipment to store, retrieve, transmit and manipulate data, often in the context of a business or other enterprise. The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information distribution technologies such as television and telephones. Several industries are associated with information technology, including computer hardware, software, electronics, semiconductors, internet, telecom equipment, e-commerce and computer services.                          Source

Information and Communications Technology (ICT) is often used as an extended synonym for information technology (IT), but is a more specific term that stresses the role of unified communications and the integration of telecommunications (telephone lines and wireless signals), computers as well as necessary enterprise software, middleware, storage, and audio-visual systems, which enable users to access, store, transmit, and manipulate information. The term ICT is also used to refer to the convergence of audio-visual and telephone networks with computer networks through a single cabling or link system. There are large economic incentives (huge cost savings due to elimination of the telephone network) to merge the telephone network with the computer network system using a single unified system of cabling, signal distribution and management. Source




Basically, computer science is the field of study of computers, information technology is the application of that study, and information and communications technology is the way we distribute the collected information.

Strength and Limitations of Computers

YES. Computers definitely have many strengths... but they also have weaknesses or limitations.

STRENGTHS

  • They are able to do specific tasks in a matter of seconds that an average human being cannot do.
  • They make our work easier and faster.
  • We are able to learn many things using this gadget.
WEAKNESSES
  • The weakness of computers is that their ability to process things faster has pushed human beings to work faster, harder, and longer, and to become dependent on computers for information, even to the point of replacing physical social interaction.
  • They cause people to spend more time using computers than doing other sensible things throughout the day.
So, always remember to balance your time with yourself and with the use of gadgets.

Graphical User Interface and Command Prompt

Graphical User Interface

In computing, a graphical user interface is a type of interface that allows users to interact with electronic devices through graphical icons and visual indicators such as secondary notation, as opposed to text-based interfaces, typed command labels or text navigation. GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces (CLIs), which require commands to be typed on the keyboard.

The actions in a GUI are usually performed through direct manipulation of the graphical elements. As well as computers, GUIs can be found in hand-held devices such as MP3 players, portable media players, gaming devices and smaller household, office and industry equipment. The term "GUI" tends not to be applied to other lower-resolution types of interfaces, such as video games (where a HUD is preferred), or to interfaces not restricted to flat screens, like volumetric displays, because the term is restricted to the scope of two-dimensional display screens able to describe generic information, in the tradition of the computer science research at PARC (Palo Alto Research Center).

The graphical user interface is presented (displayed) on the computer screen. It is the result of processed user input and usually the primary interface for human-machine interaction. The touch user interfaces popular on small mobile devices are an overlay of the visual output to the visual input.
Designing the visual composition and temporal behavior of a GUI is an important part of software application programming in the area of human-computer interaction. Its goal is to enhance the efficiency and ease of use for the underlying logical design of a stored program, a design discipline known as usability. Methods of user-centered design are used to ensure that the visual language introduced in the design is well tailored to the tasks.

The visible graphical interface features of an application are sometimes referred to as "chrome" or the "GUI" (pronounced "GOO-ee"). Typically, the user interacts with information by manipulating visual widgets that allow for interactions appropriate to the kind of data they hold. The widgets of a well-designed interface are selected to support the actions necessary to achieve the goals of the user. A model-view-controller allows for a flexible structure in which the interface is independent from and indirectly linked to application functionality, so the GUI can be easily customized. This allows the user to select or design a different skin at will, and eases the designer's work to change the interface as the user's needs evolve. Good user interface design relates to the user, not the system architecture.

Large widgets, such as windows, usually provide a frame or container for the main presentation content such as a web page, email message or drawing. Smaller ones usually act as a user-input tool.

A GUI may be designed for the requirements of a vertical market as application-specific graphical user interfaces. Examples of application-specific GUIs are:

  • Automated teller machines (ATMs)
  • Point-of-sale touchscreens at restaurants
  • Self-service checkouts used in retail stores
  • Airline self-ticketing and check-in
  • Information kiosks in public spaces, like a train station or a museum
  • Monitors or control screens in embedded industrial applications which employ a real-time operating system (RTOS)
The latest cell phones and handheld game systems also employ application specific touchscreen GUIs. Newer automobiles use GUIs in their navigation systems and touch screen multimedia centers.   Source




COMMAND PROMPT

The command prompt is used to search for, manage, create, or delete files, much like the file explorer. There are different commands to be used in it; for example, CD, MD, DIR, and REN are some of these commands.


Command Prompt

We realized that we can use two ways to manage files. In the command prompt, we just need to understand the different commands so that we can do the task correctly. A few examples are shown below.
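
For instance (the folder and file names here are just made up for illustration, not from an actual exercise), one might type commands like these at the prompt:

cd Documents          (change into the Documents folder)
md Projects           (make a new folder named Projects)
dir                   (list the files and folders in the current folder)
ren old.txt new.txt   (rename a file from old.txt to new.txt)

Each command does one small thing, so longer tasks are just a series of these short commands.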

Watson: Super Computer

With so many new inventions being made, here we come again to another one: IBM Watson. It is really advanced because it can answer a specific question in a matter of seconds. A state-of-the-art technology. See how this supercomputer makes it possible.

Why Computers are Important...


As far as I know, computers have a big part in this modern world. They did not only bring advancements but advantages as well. First of all, computers boost the development of every society because they help in gauging the needs we have to fulfill. They also contribute to the development of our country. They are also already a part of our education. Because of computers, we are opening new doors and discovering more opportunities.

 Indeed, computers help shape a child's future.

Common Information and Communications Technology (ICT) Software

Other lessons that we learned in Computer Science are:
  • File Organization
  • Office Application
  • Web Services
FILE ORGANIZATION
             File organization, also known as a file system, is a system that an operating system uses to organize files.

Some systems interact smoothly with the operating system but provide more features, such as improved backup procedures and stricter file protection.

Office Applications
         These are software used in business, such as word processing, spreadsheets, database management, and e-mail. Common office applications are widely available from vendors in packaged sets or suites.

Microsoft Word 

Microsoft Excel

Microsoft PowerPoint


Microsoft Publisher

Web Services
       They are applications or software that use the internet to operate and produce output.
~~Web Browser- it is used to find web pages, images, video and other files.
~~E-mail Software- for creating, sending, receiving and organizing electronic mail, or email.
~~Instant Messaging Software- enables you to have a private chat with another person to communicate in real time over the Internet. 

We realized that it is really important for us to know the different software, because we could never do the things we are capable of doing today without them.









Sunday, August 24, 2014

History of Computers

Computers are commonly used by most people. But have you ever thought about what computers were like before? Were they really like what we have now? What is the history of computers?

The earliest computer was the abacus, used to perform basic arithmetic operations. It is basically a calculating tool, used centuries ago before the system we are using now was invented. But it is still widely used by people from some parts of the world.


This is the abacus.

The first electronic computers used vacuum tubes, and they were huge and complex. The first general-purpose electronic computer was the ENIAC (Electronic Numerical Integrator And Computer). It was digital, although it didn’t operate with binary code, and was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supporting input from an IBM card reader and output to an IBM card punch. It took up 167 square meters, weighed 27 tons, and consumed 150 kilowatts of power. It used thousands of vacuum tubes, crystal diodes, relays, resistors, and capacitors.
The first non-general-purpose computer was the ABC (Atanasoff–Berry Computer), and other similar computers of this era included the German Z3, the ten British Colossus computers, LEO, the Harvard Mark I, and UNIVAC.

The first transistor computer was created at the University of Manchester in 1953. The most popular transistor computer was the IBM 1401. IBM also created the first disk drive in 1956, the IBM 350 RAMAC.
The second generation of computers came about thanks to the invention of the transistor, which then started replacing vacuum tubes in computer design. Transistor computers consumed far less power, produced far less heat, and were much smaller compared to the first generation, albeit still big by today’s standards.

Next came minicomputers, the first of which were still based on non-microchip transistors, and later versions of which were hybrids, based on both transistors and microchips, such as IBM’s System/360. They were much smaller and cheaper than the first and second generations of computers, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the proliferation of microchips in computers grew.
The invention of the integrated circuits (ICs), also known as microchips, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties microchips started making their way into computers, but the process was gradual, and second generation of computers still held on.
The first microchip-based central processing units consisted of multiple microchips for different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, where all of the necessary CPU components were put onto a single microchip, called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004.
The advent of the microprocessor spawned the evolution of the microcomputers, the kind that would eventually become personal computers that we are familiar with today.

It is arguable which of the early microcomputers could be called a first. CTC Datapoint 2200 is one candidate, although it actually didn’t contain a microprocessor (being based on a multi-chip CPU design instead), and wasn’t meant to be a standalone computer, but merely a terminal for the mainframes. The reason some might consider it a first microcomputer is because it could be used as a de-facto standalone computer, it was small enough, and its multi-chip CPU architecture actually became a basis for the x86 architecture later used in IBM PC and its descendants. Plus, it even came with a keyboard and a monitor, an exception in those days.
The first microcomputers were a weird bunch. They often came in kits, and many were essentially just boxes with lights and switches, usable only to engineers and hobbyists who could understand binary code. Some, however, did come with a keyboard and/or a monitor, bearing somewhat more resemblance to modern computers.
However, if we are looking for the first microcomputer that came with a proper microprocessor, was meant to be a standalone computer, and didn’t come as a kit then it would be Micral N, which used Intel 8008 microprocessor.
Popular early microcomputers which did come in kits include MOS Technology KIM-1, Altair 8800, and Apple I. Altair 8800 in particular spawned a large following among the hobbyists, and is considered the spark that started the microcomputer revolution, as these hobbyists went on to found companies centered around personal computing, such as Microsoft, and Apple.

As microcomputers continued to evolve, they became easier to operate, making them accessible to a larger audience. They typically came with a keyboard and a monitor, or could be easily connected to a TV, and they supported visual representation of text and numbers on the screen.
In other words, lights and switches were replaced by screens and keyboards, and the necessity to understand binary code was diminished as they increasingly came with programs that could be used by issuing more easily understandable commands. Famous early examples of such computers include the Commodore PET, the Apple II, and in the 80s the IBM PC.
The nature of the underlying electronic components didn’t change between these computers and the modern computers we know today, but what did change was the number of circuits that could be put onto a single microchip. Intel’s co-founder Gordon Moore predicted the doubling of the number of transistors on a single chip every two years, which became known as “Moore’s Law”, and this trend has roughly held for over 30 years thanks to advancing manufacturing processes and microprocessor designs.
The consequence was a predictable exponential increase in processing power that could be put into a smaller package, which had a direct effect on the possible form factors as well as applications of modern computers, which is what most of the forthcoming paradigm shifting innovations in computing were about.

Possibly the most significant of those shifts was the invention of the graphical user interface, and the mouse as a way of controlling it. Doug Engelbart and his team at the Stanford Research Lab developed the first mouse and a graphical user interface, demonstrated in 1968. They were just a few years short of the beginning of the personal computer revolution sparked by the Altair 8800, so their idea didn’t take hold.
Instead it was picked up and improved upon by researchers at the Xerox PARC research center, which in 1973 developed the Xerox Alto, the first computer with a mouse-driven GUI. It never became a commercial product, however, as Xerox management wasn’t ready to dive into the computer market and didn’t see the potential of what they had early enough.
It took Steve Jobs negotiating a stocks deal with Xerox in exchange for a tour of their research center to finally bring the user friendly graphical user interface, as well as the mouse, to the masses. Steve Jobs was shown what Xerox PARC team had developed, and directed Apple to improve upon it. In 1984 Apple introduced the Macintosh, the first mass-market computer with a graphical user interface and a mouse.
Microsoft later caught on and produced Windows, and the historic competition between the two companies started, resulting in improvements to the graphical user interface to this day.
Meanwhile IBM was dominating the PC market with their IBM PC, and Microsoft was riding on their coat tails by being the one to produce and sell the operating system for the IBM PC known as “DOS” or “Disk Operating System”. Macintosh, with its graphical user interface, was meant to dislodge IBM’s dominance, but Microsoft made this more difficult with their PC-compatible Windows operating system with its own GUI.

The first laptop that was commercialized was the Osborne 1 in 1981, with a small 5″ CRT monitor and a keyboard that sat inside the lid when closed. It ran CP/M (the OS that Microsoft bought and based DOS on). Later portable computers included the Bondwell 2, released in 1985, also running CP/M, which was among the first with a hinge-mounted LCD display. The Compaq Portable was the first IBM PC-compatible portable computer, and it ran MS-DOS, but was less portable than the Bondwell 2. Other examples of early portable computers included the Epson HX-20, GRiD Compass, Dulmont Magnum, Kyotronic 85, Commodore SX-64, IBM PC Convertible, Toshiba T1100, T1000, and T1200, etc.
As it turned out, the idea of a laptop-like portable computer existed even before it was possible to create one, and it was developed at Xerox PARC by Alan Kay, who called it the Dynabook and intended it for children. The first portable computer that was created was the Xerox Notetaker, but only 10 were produced.
The first portable computers which resemble modern laptops in features were Apple’s PowerBooks, which first introduced a built-in trackball, and later a trackpad and optional color LCD screens. IBM’s ThinkPad was largely inspired by the PowerBook’s design, and the evolution of the two led to laptops and notebook computers as we know them. PowerBooks were eventually replaced by modern MacBook Pros.
Of course, much of the evolution of portable computers was enabled by the evolution of microprocessors, LCD displays, battery technology and so on. This evolution ultimately allowed computers even smaller and more portable than laptops, such as PDAs, tablets, and smartphones.

source: http://www.historyofcomputer.org/

Saturday, August 23, 2014

First Class (computers)

In our first class in computer science, we were asked to introduce ourselves and the things we do using computers. Most of us said that we use computers for, of course, academic purposes and researching, as well as playing games and using social networking sites. For me, it is really an advantage that we already have gadgets in this modern and technologically inclined world. Computers are known to be machines that can do tasks automatically.

This picture shows a computer. It is an earlier version of the computer, but now there are more modern versions. It was indeed a blessing for us that there is something that can make our activities faster, easier, and less complicated. Thanks to computers!