Sunday 23 February 2014

Striptease (1996) BluRay 720p 850MB

Erin Grant loses custody of her daughter after separating from her husband Darrell, a petty thief. Strapped for cash, she becomes a dancer at a strip club, where one night Congressman Dilbeck (in disguise) attacks another member of the audience.
A spectator who recognizes Dilbeck and has a soft spot for Erin offers to get her daughter back by blackmailing Dilbeck. Things don't work out as planned, though.
Info:
Release Date: 28 June 1996 (USA)
Genre: Crime | Drama | Thriller
Stars: Demi Moore, Burt Reynolds, Armand Assante
Quality: BluRay 720p
Encoder: SHQ
Source: 720p.BluRay.x264-HD4U
Size: 850MB
*Subtitles: Indonesian, English

Download Link

*Join the file parts with HJSplit; for instructions, read the tutorial in the PANDUAN menu.
 

Saturday 22 February 2014

When God Created Woman (Ketika Tuhan Menciptakan Wanita)


When God created woman, an angel came and asked,
"Why is it taking so long to create woman, God?"
God answered,
"Have you seen every detail I have created for woman? Look: her two hands can care for many children at the same time; she has an embrace that can heal heartbreak and despair; and she does all of that with only two hands."

Amazed, the angel replied,
"With only two hands? Impossible!"
God answered,
"Do you not know that she can also heal herself, and can work eighteen hours a day?"

The angel came closer, looked at the woman, and asked,
"God, why does she look so tired and fragile, as if her burdens were too heavy for her?"
God answered,
"It is not what you imagine. Those are tears."
"What are they for?" asked the angel.

God continued,
"Tears are one of the ways she expresses joy, worry, love, loneliness, suffering, and pride; and this woman has the power to captivate men. These are only a few of the abilities a woman has.
She can carry heavier burdens than a man; she can keep her happiness and her opinions to herself; she can smile while her heart is screaming, sing while she cries, cry when she is moved, and even laugh when she is afraid.
She sacrifices herself for the people she loves; she can stand up against injustice; she weeps when she sees her child win; she cheers when her friends laugh with happiness; she is overjoyed at the sound of a new birth.
She grieves at news of sickness and death, yet she finds a way through it. She knows that a kiss and a hug can heal a wound."
Source

Monday 10 February 2014

Database Management Systems (DBMS)

Database management systems (DBMS) are collections of tools used to manage databases. All DBMS perform four basic functions (see the sketch after this list):
  • Create, modify, and delete data structures, e.g. tables
  • Add, modify, and delete data
  • Retrieve data selectively
  • Generate reports based on data
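
As a rough illustration of these four functions (not from the original text; the table, columns, and data below are invented for the example), here is a minimal sketch using Python's built-in sqlite3 module:

import sqlite3

con = sqlite3.connect(":memory:")   # throwaway in-memory database
cur = con.cursor()

# 1. Create a data structure (a table).
cur.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, price REAL)")

# 2. Add, modify, and delete data.
cur.execute("INSERT INTO product (name, price) VALUES ('Pencil', 0.50)")
cur.execute("INSERT INTO product (name, price) VALUES ('Notebook', 2.75)")
cur.execute("UPDATE product SET price = 0.60 WHERE name = 'Pencil'")
cur.execute("DELETE FROM product WHERE name = 'Notebook'")

# 3. Retrieve data selectively.
cur.execute("SELECT name, price FROM product WHERE price < 1.00")
print(cur.fetchall())                      # [('Pencil', 0.6)]

# 4. Generate a simple report based on the data.
cur.execute("SELECT COUNT(*), AVG(price) FROM product")
count, avg_price = cur.fetchone()
print(f"{count} product(s), average price {avg_price:.2f}")

con.close()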


A short list of database applications would include:
  • Inventory
  • Payroll
  • Membership
  • Orders
  • Shipping
  • Reservation
  • Invoicing
  • Accounting
  • Security
  • Catalogues
  • Mailing
  • Medical records

    Database Components

    Looking from the top down, databases are composed of related tables, which in turn are composed of fields and records.

    Field

    A field is an area (within a record) reserved for a specific piece of data. Examples: customer number, customer name, street address, city, state, phone, current balance.
    Fields are defined by the following (a short sketch follows this list):
    • Field name

    • Data type
      • Character: text, including such things as telephone numbers and zip codes
      • Numeric: numbers which can be manipulated using math operators
      • Date: calendar dates which can be manipulated mathematically
      • Logical: True or False, Yes or No
    • Field size
      • Amount of space reserved for storing data
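
    As a small, hypothetical illustration of field names, data types, and sizes (none of these column names come from the text, and SQLite does not actually enforce declared sizes), the customer fields mentioned above might be declared like this with Python's built-in sqlite3 module:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""
        CREATE TABLE customer (
            customer_number INTEGER,        -- numeric
            customer_name   VARCHAR(40),    -- character, size 40
            street_address  VARCHAR(60),    -- character, size 60
            city            VARCHAR(30),
            state           VARCHAR(2),
            phone           VARCHAR(15),    -- character: not used in arithmetic
            current_balance REAL,           -- numeric: usable with math operators
            signup_date     DATE,           -- date
            active          BOOLEAN         -- logical: true/false
        )
    """)
    con.close()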

    Record

    A record is the collection of values for all the fields pertaining to one entity, e.g. a person, a product, a company, or a transaction.

    Table

    A table is a collection of related records: for example, an employee table, a product table, a customer table, and an orders table.
    In a table, records are represented by rows and fields are represented as columns.

    Database

    A database is a collection of related tables. It can also include other objects, such as queries, forms, and reports. The structure of a database is the relationships between its tables.

    Relationships

    There are three types of relationships which can exist between tables:
    • One-to-One
    • One-to-Many
    • Many-to-Many
    The most common relationships in relational databases are One-to-Many and Many-to-Many.
    An example of a One-to-Many relationship would be a Customer table and an Orders table: each order has only one customer, but a customer can make many orders.
    One-to-Many relationships consist of two tables, the "one" table, and the "many" table.
    An example of a Many-to-Many relationship would be an Orders table and a Products table: an order can contain many products, and a product can be on many orders.
    A Many-to-Many relationship is implemented with three tables: two "one" tables, each in a One-to-Many relationship with a third table. The third table is often called a junction (or linking) table, as sketched below.
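
    A minimal sketch of that three-table arrangement, again using Python's built-in sqlite3 module (the table and column names are illustrative, not taken from the text):

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE orders   (order_id   INTEGER PRIMARY KEY, order_date TEXT);
        CREATE TABLE products (product_id INTEGER PRIMARY KEY, name TEXT);

        -- The junction table: each row pairs one order with one product, turning
        -- the Many-to-Many relationship into two One-to-Many relationships.
        CREATE TABLE order_items (
            order_id   INTEGER REFERENCES orders(order_id),
            product_id INTEGER REFERENCES products(product_id),
            quantity   INTEGER,
            PRIMARY KEY (order_id, product_id)
        );
    """)
    con.close()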

    Key Fields

    In order for two tables to be related, they must share a common field. The common field (key field) in the "one" table of a One-to-Many relationship needs to be a primary key. The same field in the "many" table of a One-to-Many relationship is called the foreign key.

    Primary key

    A Primary key is a field or a combination of two or more fields. The value in the primary key field for each record uniquely identifies that record.
    In the example above, customer number is the Primary key for the Customer table. A customer number identifies one and only one customer in the Customer table. The primary key for the Orders table would be a field for the order number.

    Foreign key

    When a "one" table's primary key field is added to a related "many" table in order to create the common field which relates the two tables, it is called a foreign key in the "many" table.
    In the example above, the primary key (customer number) from the Customer table ("one" table) is a foreign key in the Orders table ("many" table).
    For the "many" records of the Order table, the foreign key identifies with which unique record in the Customer table they are associated.

    Rationalization and Redundancy

    Grouping logically related fields into distinct tables, determining key fields, and then relating the tables through those common key fields is called rationalizing (more commonly, normalizing) a database. There are two major reasons for designing a database this way:
    • To avoid wasting storage space for redundant data
    • To eliminate the complication of updating duplicate data copies
    For example, in the Customers/Orders database, we want to be able to identify the customer name, address, and phone number for each order, but we want to avoid repeating that information for each order. To do so would take up storage space needlessly and make the job of updating multiple customer addresses difficult and time-consuming.
    To avoid redundancy:
    1. Place all the fields related to customers (name, address, etc.) into a Customer table and create a Primary key field which uniquely identifies each customer: Customer ID.
    2. Put all the fields related to orders (date, salesperson, total, etc.) into the Orders table.
    3. Include the Primary key field (Customer ID) from the Customer table in the table for Orders.
    The One-to-Many relationship between Customer and Orders is defined by the common field Customer ID. In the table for Customers (the "one" table) Customer ID is a primary key, while in the Orders table (the "many" table) it is a foreign key.
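
    A small sketch of the normalized Customers/Orders design just described (table names and sample rows are invented for the example): customer details are stored once, each order carries only the Customer ID foreign key, and a join recombines them on demand.

    import sqlite3

    con = sqlite3.connect(":memory:")
    cur = con.cursor()
    cur.executescript("""
        CREATE TABLE customers (
            customer_id INTEGER PRIMARY KEY,   -- primary key: stored once per customer
            name        TEXT,
            address     TEXT,
            phone       TEXT
        );
        CREATE TABLE orders (
            order_id    INTEGER PRIMARY KEY,
            order_date  TEXT,
            customer_id INTEGER REFERENCES customers(customer_id)   -- foreign key
        );
        INSERT INTO customers VALUES (1, 'A. Rahman', 'Jl. Merdeka 1', '555-0100');
        INSERT INTO orders VALUES (10, '2014-02-10', 1);
        INSERT INTO orders VALUES (11, '2014-02-11', 1);
    """)

    # Each order row stores only the customer_id; the customer's name and address
    # live in exactly one place and are recombined with a join when needed.
    for row in cur.execute("""
            SELECT o.order_id, o.order_date, c.name, c.address
            FROM orders AS o JOIN customers AS c ON c.customer_id = o.customer_id
            """):
        print(row)
    con.close()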

    Source

    Historical Development Of The Computer Part II

    Five Generations Computers From Time To Time

     

    First Generation (1945-1956)

    With the onset of the Second World War, governments sought to develop computers to exploit their potential strategic importance. This increased funding for computer development projects hastened technical progress. By 1941 German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. The Allied forces, however, made greater strides in developing powerful computers. In 1943, the British completed a secret code-breaking computer called Colossus to decode German messages. The Colossus's impact on the development of the computer industry was rather limited for two important reasons. First, Colossus was not a general-purpose computer; it was only designed to decode secret messages. Second, the existence of the machine was kept secret until decades after the war.
    American efforts produced a broader achievement. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing an all-electronic calculator by 1944. The purpose of the computer was to create ballistic charts for the U.S. Navy. It was about half as long as a football field and contained about 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I for short, was an electronic relay computer. It used electromagnetic signals to move mechanical parts. The machine was slow (taking 3-5 seconds per calculation) and inflexible (in that sequences of calculations could not change); but it could perform basic arithmetic as well as more complex equations.
    Another computer development spurred by the war was the Electronic Numerical Integrator and Computer (ENIAC), produced by a partnership between the U.S. government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery that it consumed 160 kilowatts of electrical power, enough energy to dim the lights in an entire section of Philadelphia. Developed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC, unlike the Colossus and Mark I, was a general-purpose computer that computed at speeds 1,000 times faster than Mark I.
    In the mid-1940's John von Neumann (1903-1957) joined the University of Pennsylvania team, initiating concepts in computer design that remained central to computer engineering for the next 40 years. Von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) in 1945 with a memory to hold both a stored program as well as data. This "stored memory" technique as well as the "conditional control transfer," that allowed the computer to be stopped at any point and then resumed, allowed for greater versatility in computer programming. The key element to the von Neumann architecture was the central processing unit, which allowed all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer), built by Remington Rand, became one of the first commercially available computers to take advantage of these advances. Both the U.S. Census Bureau and General Electric owned UNIVACs. One of UNIVAC's impressive early achievements was predicting the winner of the 1952 presidential election, Dwight D. Eisenhower.
    First generation computers were characterized by the fact that operating instructions were made-to-order for the specific task for which the computer was to be used. Each computer had a different binary-coded program called a machine language that told it how to operate. This made the computer difficult to program and limited its versatility and speed. Other distinctive features of first generation computers were the use of vacuum tubes (responsible for their breathtaking size) and magnetic drums for data storage.

    Second Generation Computers (1956-1963)

    By 1948, the invention of the transistor greatly changed the computer's development. The transistor replaced the large, cumbersome vacuum tube in televisions, radios and computers. As a result, the size of electronic machinery has been shrinking ever since. The transistor was at work in the computer by 1956. Coupled with early advances in magnetic-core memory, transistors led to second generation computers that were smaller, faster, more reliable and more energy-efficient than their predecessors. The first large-scale machines to take advantage of this transistor technology were early supercomputers, Stretch by IBM and LARC by Sperry-Rand. These computers, both developed for atomic energy laboratories, could handle an enormous amount of data, a capability much in demand by atomic scientists. The machines were costly, however, and tended to be too powerful for the business sector's computing needs, thereby limiting their attractiveness. Only two LARCs were ever installed: one in the Lawrence Radiation Labs in Livermore, California, for which the computer was named (Livermore Atomic Research Computer) and the other at the U.S. Navy Research and Development Center in Washington, D.C. Second generation computers replaced machine language with assembly language, allowing abbreviated programming codes to replace long, difficult binary codes.

    Throughout the early 1960's, there were a number of commercially successful second generation computers used in business, universities, and government from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand, and others. These second generation computers were also of solid state design, and contained transistors in place of vacuum tubes. They also contained all the components we associate with the modern day computer: printers, tape storage, disk storage, memory, operating systems, and stored programs. One important example was the IBM 1401, which was universally accepted throughout industry, and is considered by many to be the Model T of the computer industry. By 1965, most large businesses routinely processed financial information using second generation computers.
    It was the stored program and programming language that gave computers the flexibility to finally be cost effective and productive for business use. The stored program concept meant that instructions to run a computer for a specific function (known as a program) were held inside the computer's memory, and could quickly be replaced by a different set of instructions for a different function. A computer could print customer invoices and minutes later design products or calculate paychecks. More sophisticated high-level languages such as COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator) came into common use during this time, and have expanded to the current day. These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas, making it much easier to program a computer. New types of careers (programmer, analyst, and computer systems expert) and the entire software industry began with second generation computers.

    Third Generation Computers (1964-1971)

    Though transistors were clearly an improvement over the vacuum tube, they still generated a great deal of heat, which damaged the computer's sensitive internal parts. The quartz rock eliminated this problem. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon disc, which was made from quartz. Scientists later managed to fit even more components on a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto the chip. Another third-generation development included the use of an operating system that allowed machines to run many different programs at once with a central program that monitored and coordinated the computer's memory.


    Fourth Generation (1971-Present)
    After the integrated circuits, the only place to go was down - in size, that is. Large scale integration (LSI) could fit hundreds of components onto one chip. By the 1980's, very large scale integration (VLSI) squeezed hundreds of thousands of components onto a chip. Ultra-large scale integration (ULSI) increased that number into the millions. The ability to fit so much onto an area about half the size of a U.S. dime helped diminish the size and price of computers. It also increased their power, efficiency and reliability. The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a minuscule chip. Whereas previously the integrated circuit had had to be manufactured to fit a special purpose, now one microprocessor could be manufactured and then programmed to meet any number of demands. Soon everyday household items such as microwave ovens, television sets and automobiles with electronic fuel injection incorporated microprocessors.
    Such condensed power allowed everyday people to harness a computer's power. They were no longer developed exclusively for large business or government contracts. By the mid-1970's, computer manufacturers sought to bring computers to general consumers. These minicomputers came complete with user-friendly software packages that offered even non-technical users an array of applications, most popularly word processing and spreadsheet programs. Pioneers in this field were Commodore, Radio Shack and Apple Computers. In the early 1980's, arcade video games such as Pac Man and home video game systems such as the Atari 2600 ignited consumer interest for more sophisticated, programmable home computers.
    In 1981, IBM introduced its personal computer (PC) for use in the home, office and schools. The 1980's saw an expansion in computer use in all three arenas as clones of the IBM PC made the personal computer even more affordable. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used. Computers continued their trend toward a smaller size, working their way down from desktop to laptop computers (which could fit inside a briefcase) to palmtop (able to fit inside a breast pocket). In direct competition with IBM's PC was Apple's Macintosh line, introduced in 1984. Notable for its user-friendly design, the Macintosh offered an operating system that allowed users to move screen icons instead of typing instructions. Users controlled the screen cursor using a mouse, a device that mimicked the movement of one's hand on the computer screen.
    As computers became more widespread in the workplace, new ways to harness their potential developed. As smaller computers became more powerful, they could be linked together, or networked, to share memory space, software, information and communicate with each other. As opposed to a mainframe computer, which was one powerful computer that shared time with many terminals for many applications, networked computers allowed individual computers to form electronic co-ops. Using either direct wiring, called a Local Area Network (LAN), or telephone lines, these networks could reach enormous proportions. A global web of computer circuitry, the Internet, for example, links computers worldwide into a single network of information. During the 1992 U.S. presidential election, vice-presidential candidate Al Gore promised to make the development of this so-called "information superhighway" an administrative priority. Though the possibilities envisioned by Gore and others for such a large network are often years (if not decades) away from realization, the most popular use today for computer networks such as the Internet is electronic mail, or E-mail, which allows users to type in a computer address and send messages through networked terminals across the office or across the world.

    Fifth Generation (Present and Beyond)

    Defining the fifth generation of computers is somewhat difficult because the field is in its infancy. The most famous example of a fifth generation computer is the fictional HAL9000 from Arthur C. Clarke's novel, 2001: A Space Odyssey. HAL performed all of the functions currently envisioned for real-life fifth generation computers. With artificial intelligence, HAL could reason well enough to hold conversations with its human operators, use visual input, and learn from its own experiences. (Unfortunately, HAL was a little too human and had a psychotic breakdown, commandeering a spaceship and killing most humans on board.)
    Though the wayward HAL9000 may be far from the reach of real-life computer designers, many of its functions are not. Using recent engineering advances, computers may be able to accept spoken word instructions and imitate human reasoning. The ability to translate a foreign language is also a major goal of fifth generation computers. This feat seemed a simple objective at first, but appeared much more difficult when programmers realized that human understanding relies as much on context and meaning as it does on the simple translation of words.
    Many advances in the science of computer design and technology are coming together to enable the creation of fifth-generation computers. One such engineering advance is parallel processing, which replaces von Neumann's single central processing unit design with a system that harnesses many CPUs working as one. Another advance is superconductor technology, which allows the flow of electricity with little or no resistance, greatly improving the speed of information flow. Computers today have some attributes of fifth generation computers. For example, expert systems assist doctors in making diagnoses by applying the problem-solving steps a doctor might use in assessing a patient's needs. It will take several more years of development before expert systems are in widespread use.



    Historical Development Of The Computer Part I


    Inventor of First Computer Machine in the World
    The abacus, which emerged about 5,000 years ago in Asia Minor and is still in use today, may be considered the first computer. This device allows users to make computations using a system of sliding beads arranged on a rack. Early merchants used the abacus to keep trading transactions. But as the use of paper and pencil spread, particularly in Europe, the abacus lost its importance. It took nearly 12 centuries, however, for the next significant advance in computing devices to emerge. In 1642, Blaise Pascal (1623-1662), the 18-year-old son of a French tax collector, invented what he called a numerical wheel calculator to help his father with his duties. This brass rectangular box, also called a Pascaline, used eight movable dials to add sums up to eight figures long. Pascal's device used a base of ten to accomplish this. For example, as one dial moved ten notches, or one complete revolution, it moved the next dial - which represented the ten's column - one place. When the ten's dial moved one revolution, the dial representing the hundred's place moved one notch and so on. The drawback to the Pascaline, of course, was its limitation to addition. 

    In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz (1646-1716), improved the Pascaline by creating a machine that could also multiply. Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears and dials. Partly by studying Pascal's original notes and drawings, Leibniz was able to refine his machine. The centerpiece of the machine was its stepped-drum gear design, which offered an elongated version of the simple flat gear. It wasn't until 1820, however, that mechanical calculators gained widespread use. Charles Xavier Thomas de Colmar, a Frenchman, invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithmometer, presented a more practical approach to computing because it could add, subtract, multiply and divide. With its enhanced versatility, the arithmometer was widely used up until the First World War. Although later inventors refined Colmar's calculator, together with fellow inventors Pascal and Leibniz, he helped define the age of mechanical computation.

    The real beginnings of computers as we know them today, however, lay with an English mathematics professor, Charles Babbage (1791-1871). Frustrated at the many errors he found while examining calculations for the Royal Astronomical Society, Babbage declared, "I wish to God these calculations had been performed by steam!" With those words, the automation of computers had begun. By 1812, Babbage noticed a natural harmony between machines and mathematics: machines were best at performing tasks repeatedly without mistake, while mathematics, particularly the production of mathematical tables, often required the simple repetition of steps. The problem centered on applying the ability of machines to the needs of mathematics. Babbage's first attempt at solving this problem came in 1822, when he proposed a machine to compute mathematical tables by the method of differences, called a Difference Engine. Powered by steam and as large as a locomotive, the machine would have a stored program and could perform calculations and print the results automatically. After working on the Difference Engine for 10 years, Babbage was suddenly inspired to begin work on the first general-purpose computer, which he called the Analytical Engine. Babbage's assistant, Augusta Ada King, Countess of Lovelace (1815-1852) and daughter of the English poet Lord Byron, was instrumental in the machine's design. One of the few people who understood the Engine's design as well as Babbage did, she helped revise plans, secure funding from the British government, and communicate the specifics of the Analytical Engine to the public. Lady Lovelace's fine understanding of the machine also allowed her to create the instruction routines to be fed into the computer, making her the first female computer programmer. In the 1980's, the U.S. Defense Department named a programming language, Ada, in her honor.

    Babbage's steam-powered Engine, although ultimately never constructed, may seem primitive by today's standards. However, it outlined the basic elements of a modern general purpose computer and was a breakthrough concept. Consisting of over 50,000 components, the basic design of the Analytical Engine included input devices in the form of perforated cards containing operating instructions and a "store" for memory of 1,000 numbers of up to 50 decimal digits long. It also contained a "mill" with a control unit that allowed processing instructions in any sequence, and output devices to produce printed results. Babbage borrowed the idea of punch cards to encode the machine's instructions from the Jacquard loom. The loom, produced in 1820 and named after its inventor, Joseph-Marie Jacquard, used punched boards that controlled the patterns to be woven. 

    In 1889, an American inventor, Herman Hollerith (1860-1929), also applied the Jacquard loom concept to computing. His first task was to find a faster way to compute the U.S. census. The previous census in 1880 had taken nearly seven years to count and with an expanding population, the bureau feared it would take 10 years to count the latest census. Unlike Babbage's idea of using perforated cards to instruct the machine, Hollerith's method used cards to store data information which he fed into a machine that compiled the results mechanically. Each punch on a card represented one number, and combinations of two punches represented one letter. As many as 80 variables could be stored on a single card. Instead of ten years, census takers compiled their results in just six weeks with Hollerith's machine. In addition to their speed, the punch cards served as a storage method for data and they helped reduce computational errors. Hollerith brought his punch card reader into the business world, founding Tabulating Machine Company in 1896, later to become International Business Machines (IBM) in 1924 after a series of mergers. Other companies such as Remington Rand and Burroughs also manufactured punch readers for business use. Both business and government used punch cards for data processing until the 1960's. 

    In the ensuing years, several engineers made other significant advances. Vannevar Bush (1890-1974) developed a calculator for solving differential equations in 1931. The machine could solve complex differential equations that had long left scientists and mathematicians baffled. The machine was cumbersome because hundreds of gears and shafts were required to represent numbers and their various relationships to each other. To eliminate this bulkiness, John V. Atanasoff (b. 1903), a professor at Iowa State College (now called Iowa State University) and his graduate student, Clifford Berry, envisioned an all-electronic computer that applied Boolean algebra to computer circuitry. This approach was based on the mid-19th century work of George Boole (1815-1864) who clarified the binary system of algebra, which stated that any mathematical equations could be stated simply as either true or false. By extending this concept to electronic circuits in the form of on or off, Atanasoff and Berry had developed the first all-electronic computer by 1940. Their project, however, lost its funding and their work was overshadowed by similar developments by other scientists.

    Sunday 9 February 2014

    Virgin (2012) DVDRip 550MB

    Three different stories about people of different ages and backgrounds experiencing their first sexual contact.










    Info:
    Release Date: 12 May 2012 (Japan)
    Quality: DVDRip
    IMDB Rating: 2.3/10 from 13 users
    Genre: Comedy | Asia
    Stars: Kana Furuhashi, Satoshi Hirayama, Takahiro Iwasaki
    Encoder: SHQ
    *Subtitle: (N/A)



    DOWNLOAD LINK

     

    Malena (2000) BluRay 720p 700MB

    A woman provokes sensual awakenings in a group of adolescent boys.
    On the day in 1940 that Italy enters the war, two things happen to the 12-year-old Renato: he gets his first bike, and he gets his first look at Malèna. She is a beautiful, silent outsider who has moved to this Sicilian town to be with her husband, Nino. He promptly goes off to war, leaving her to the lustful eyes of the men and the sharp tongues of the women. During the next few years, as Renato grows toward manhood, he watches Malèna suffer and prove her mettle. He sees her loneliness, then her grief when Nino is reported dead, the effects of slander on her relationship with her father, her poverty and search for work, and her final humiliations. Will Renato learn courage from Malèna and stand up for her?

    What an Application Is (Pengertian Aplikasi)



    The word "application" comes from the English "application", meaning the act of applying, a request, or a use. As a term, an application is a ready-to-use program built to perform a function for the users of the application service, and for use by other applications, aimed at an intended target (Djasaidram, 2010:12). According to the executive computer dictionary, an application means problem solving that uses one of the application data-processing techniques, usually oriented toward a desired or expected computation or toward the expected processing of data.
    An information system is a set of computer applications that supports an organization's operations, such as operating, installing, and maintaining computers, software, and data. An application usually takes the form of software containing a set of instructions or a program built to carry out a desired task.

    After Porn Ends (2010) DOCU 720p WEBRip 600MB bioskopkita

    After Porn Ends is a documentary that examines not only the lives and careers of some of the biggest names in the history of the adult entertainment industry, but also what happens to them after they leave the business and try to live the "normal" lives that millions of other Americans enjoy. They hailed from the rural South, steel towns, and the San Fernando Valley. As teenagers and young adults, none of them thought that porn was in their future. They were artists, baseball players, child prodigies, and even Ivy Leaguers. Now, after their lives in porn, they're TV stars, bounty hunters, writers, and social activists. What happened in between? And now that they've moved on, can they really live a normal life after porn?

    Celebrity Sex Tape (2012) BluRay 720p 600MB bioskopkita

    A group of college nerds secretly records a washed-up celebrity having sex and posts the tape on the internet. When the publicity revives the actress's career, every B-list celebrity and reality-TV reject in Hollywood wants to star in the guys' next "production".





    Download Microsoft Office 2007 Full Serial Number + Crack Activator

    Microsoft Office 2007
    Officially called the 2007 Microsoft Office System, this is the Windows version of the Microsoft Office System, Microsoft's productivity suite. Known as Office 12 in the early stages of its beta cycle, it was released to volume-license customers on November 30, 2006 and made available to retail customers on January 30, 2007, the same respective dates on which Windows Vista was released to volume-license and retail customers. Office 2007 contains a number of new features, the most important of which is an entirely new graphical user interface called the Fluent User Interface (originally referred to as the Ribbon User Interface), which replaces the menus and toolbars that had been a cornerstone of Office since its beginning with a tabbed toolbar known as the Ribbon. Office 2007 requires Windows XP with Service Pack 2 or higher, Windows Server 2003 with Service Pack 1 or higher, Windows Vista, or Windows 7. Office 2007 is the last version of Microsoft Office officially supported on Windows XP Professional x64 Edition.
    The Ribbon user interface is a task-oriented graphical user interface (GUI). It features a main menu button, widely known as the "Office Button". The Ribbon interface was further refined in Microsoft Office 2010.

    Monday 3 February 2014

    Understanding Ping at the Command Prompt (CMD)


    Ping is a utility that runs on top of ICMP (Internet Control Message Protocol) to check the connection between two computers on the internet. Ping can also refer to a basic program that lets a user verify that a particular Internet Protocol address exists and can accept requests.


    Ping is used to make sure that the computer being contacted is up and responding. For example, if we want to send a file to a host address, we can use ping first to see how long the round trip takes. (The name "ping" comes from the sonar of a submarine, which makes a pinging sound when it detects an object.)
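
    As a rough illustration (not part of the original post), here is a short Python sketch that shells out to the system's ping command and reports whether a host replied; the host name below is only a placeholder:

    import platform
    import subprocess

    def host_is_reachable(host: str) -> bool:
        # Windows ping uses "-n" for the packet count; Unix-like systems use "-c".
        count_flag = "-n" if platform.system() == "Windows" else "-c"
        result = subprocess.run(["ping", count_flag, "1", host],
                                capture_output=True, text=True)
        # ping exits with status 0 when at least one reply was received.
        return result.returncode == 0

    if __name__ == "__main__":
        print("example.com reachable:", host_is_reachable("example.com"))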




    Acehomegazen
    A Wound in the Heart Is Like a Burning Wound