Monday, October 22, 2007

Upload and Retrieve Image or File to Database using ASP.NET 2.0

As a developer, you may face requirements to upload large documents, PDFs, and images from your application. How do you manage and store such large data? The traditional approach is to store those large files on the web server's file system. But there is also the database approach, which lets you store large documents such as PDFs, .zip files, and images as binary data directly in the database itself. Let's elaborate on the database approach a bit further. How do we usually store large data objects in databases like SQL Server 2000? SQL Server 2000 provides the dedicated image data type to hold image data, and SQL Server 2005 adds a new data type, varbinary(max), which allows storing binary data up to 2 GB in size.

Even with these new data types, we still need to understand that working with binary data is not as straightforward as working with text data. So we are here to discuss how to store and retrieve image files directly from a database using ASP.NET 2.0.

We will create an application that lets users upload images and displays the uploaded pictures. The uploaded images will be stored in the database as binary data. To hold the image data, we need to create a new table called PictureTable, as shown below.

http://www.beansoftware.com/ASP.NET-Tutorials/Images/Binary-Data-Database.jpg




if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[PictureTable]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[PictureTable]
GO

CREATE TABLE [dbo].[PictureTable] (
[ImageID] [int] IDENTITY (1, 1) NOT NULL ,
[Title] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[DateAdded] [datetime] NOT NULL ,
[MIMEType] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[Image] [image] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
Schema script for PictureTable

This table records the details and content of each picture. The PictureTable table's MIMEType field holds the MIME type of the uploaded image (image/jpeg for JPG files, image/gif for GIF files, and so on); the MIME type tells the browser how to render the binary data. The Image column holds the actual binary contents of the picture.

<asp:Label ID="Label1" runat="server" Text="Upload Image"></asp:Label>
<asp:Label ID="Label2" runat="server" Text="Title"></asp:Label>
<asp:TextBox ID="TextBox1" runat="server"></asp:TextBox>
<asp:Label ID="Label3" runat="server" Text="Image"></asp:Label>
<asp:FileUpload ID="FileUpload1" runat="server" />
<asp:Button ID="Button1" runat="server" Text="Upload"/>

http://www.beansoftware.com/ASP.NET-Tutorials/Images/Binary-Data-Upload-Images.jpg

As shown above, we use the FileUpload control to browse for picture files on the hard disk. FileUpload is a composite control that combines a textbox and a Browse button. To add this control, simply drag and drop the FileUpload control from the Toolbox, as shown below.



http://www.beansoftware.com/ASP.NET-Tutorials/Images/Binary-Data-Toolbox.jpg
FileUpload control on the Toolbox

Once the user has selected a picture file with the FileUpload control, clicking the Upload button inserts the selected image into PictureTable as a new record. The insert logic is handled in the Click event of the Upload button, as shown below.

'Requires: Imports System.Data.SqlClient and Imports System.Configuration
Protected Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles Button1.Click
    'Make sure a file has been successfully uploaded
    If FileUpload1.PostedFile Is Nothing OrElse String.IsNullOrEmpty(FileUpload1.PostedFile.FileName) OrElse FileUpload1.PostedFile.InputStream Is Nothing Then
        Label1.Text = "Please upload a valid picture file"
        Exit Sub
    End If

    'Make sure we are dealing with a GIF, JPG or PNG file
    Dim extension As String = System.IO.Path.GetExtension(FileUpload1.PostedFile.FileName).ToLower()
    Dim MIMEType As String = Nothing
    Select Case extension
        Case ".gif"
            MIMEType = "image/gif"
        Case ".jpg", ".jpeg", ".jpe"
            MIMEType = "image/jpeg"
        Case ".png"
            MIMEType = "image/png"
        Case Else
            'Invalid file type uploaded
            Label1.Text = "Not a valid file format"
            Exit Sub
    End Select

    'Connect to the database and insert a new record into PictureTable
    Using myConnection As New SqlConnection(ConfigurationManager.ConnectionStrings("ImageGalleryConnectionString").ConnectionString)
        Const SQL As String = "INSERT INTO [PictureTable] ([Title], [DateAdded], [MIMEType], [Image]) VALUES (@Title, GETDATE(), @MIMEType, @ImageData)"
        Dim myCommand As New SqlCommand(SQL, myConnection)
        myCommand.Parameters.AddWithValue("@Title", TextBox1.Text.Trim())
        myCommand.Parameters.AddWithValue("@MIMEType", MIMEType)

        'Load the FileUpload's InputStream into a Byte array
        '(VB array bounds are inclusive, so size it to Length - 1)
        Dim imageBytes(CInt(FileUpload1.PostedFile.InputStream.Length) - 1) As Byte
        FileUpload1.PostedFile.InputStream.Read(imageBytes, 0, imageBytes.Length)
        myCommand.Parameters.AddWithValue("@ImageData", imageBytes)

        myConnection.Open()
        myCommand.ExecuteNonQuery()
        myConnection.Close()
    End Using
End Sub

Once the user has selected a file and posted back the form by clicking the "Upload" button, the binary contents of the specified file are posted back to the web server. From the server-side code, this binary data is available through the FileUpload control's PostedFile.InputStream property.

This event handler starts off by ensuring that a file has been uploaded. It then determines the MIME type from the extension of the uploaded file. Note how the @ImageData parameter is set: first, a byte array named imageBytes is created and sized to the length of the uploaded file's InputStream; next, this byte array is filled with the binary contents of the InputStream using its Read method. It is this byte array that is supplied as @ImageData's value.

Displaying Binary Data

Regardless of the technique used to store the binary data in the database, to retrieve and display it we need to create a new ASP.NET page. This page, named DisplayPicture.aspx, is passed an ImageID through a querystring parameter and returns the binary data from the specified picture's Image field. Once completed, a particular picture can be viewed by browsing to a URL such as
http://localhost:3219/BinaryDataVb/Displaypicture.aspx?ImageID=5.

Therefore, to display an image on a web page, we can use an Image control whose ImageUrl property is set to the appropriate URL (e.g. <asp:Image ID="Image1" runat="server" ImageUrl="DisplayPicture.aspx?ImageID=5" />).

'Requires: Imports System.Data.SqlClient and Imports System.Configuration
Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
    Dim ImageID As Integer = Convert.ToInt32(Request.QueryString("ImageID"))

    'Connect to the database and bring back the image contents & MIME type for the specified picture
    Using myConnection As New SqlConnection(ConfigurationManager.ConnectionStrings("ImageGalleryConnectionString").ConnectionString)

        Const SQL As String = "SELECT [MIMEType], [Image] FROM [PictureTable] WHERE [ImageID] = @ImageID"
        Dim myCommand As New SqlCommand(SQL, myConnection)
        myCommand.Parameters.AddWithValue("@ImageID", ImageID)

        myConnection.Open()
        Dim myReader As SqlDataReader = myCommand.ExecuteReader()

        If myReader.Read() Then
            Response.ContentType = myReader("MIMEType").ToString()
            Response.BinaryWrite(CType(myReader("Image"), Byte()))
        End If

        myReader.Close()
        myConnection.Close()
    End Using
End Sub
Code listing for DisplayPicture.aspx

http://www.beansoftware.com/ASP.NET-Tutorials/Images/Binary-Data-QueryString.jpg

Retrieving a picture using the QueryString parameter

DisplayPicture.aspx does not include any HTML markup in the .aspx page. In the code-behind class's Page_Load event handler, the specified picture's MIMEType and Image values are retrieved from the database using ADO.NET code. Next, the page's ContentType is set to the value of the MIMEType field, and the binary data is emitted using Response.BinaryWrite. Once DisplayPicture.aspx is complete, an image can be viewed by visiting its URL directly.

In the same way, we can serve a .zip, .pdf, or any other binary file stored in the database. To retrieve a .zip file, you don't need an Image control. Instead of the src attribute of an image tag, use the href attribute of a hyperlink tag for .zip and .pdf files (e.g. <a href="GetZipFromDB.aspx?id=5">Get great zip file</a>).

Kamen Rider Blade

Well, I was watching this movie, and it's the coolest Kamen Rider movie of all.

 

SQL Server 2005 Data Mining


Introduction

The Microsoft SQL Server 2005 Data Mining Platform introduces significant capabilities to address data mining in both traditional and new ways. In traditional terms, data mining can predict future results based on input, or attempt to find relationships among data or cluster data in previously unrecognized yet similar groups.

Microsoft data mining tools are different from traditional data mining applications in significant ways. First, they support the entire development lifecycle of data in the organization, which Microsoft refers to as Integrate, Analyze, and Report. This ability frees the data mining results from the hands of a select few analysts and opens those results up to the entire organization. Second, SQL Server 2005 Data Mining is a platform for developing intelligent applications, not a stand-alone application. You can build custom applications that are intelligent, because the data mining models are easily accessible to the outside world. Further, the model is extensible so that third parties can add custom algorithms to support particular mining needs. Finally, Microsoft data mining algorithms can be run in real time, allowing for the real-time validation of data against a set of mined data.

Creating Intelligent Applications

The concept behind creating intelligent applications is to take the benefits of data mining and apply them to the entire data entry, integration, analysis, and reporting process. Most data mining tools show predictions of future results and help determine relationships between different data elements. Most of these tools are run against the data and produce results which are then interpreted separately. Many data mining tools are stand-alone applications that exist for the purpose of forecasting demand or identifying relationships and their functionality stops there.

Intelligent applications take the output of data mining and apply that as input to the entire process. One example of an application that makes use of a data mining model would be a data entry form for accepting personal information. Users of the application can enter a tremendous amount of data, such as birth date, gender, education level, income level, occupation, and so forth. Certain combinations of attributes don't make logical sense; for example, a seven-year-old person working as a doctor and holding a high-school diploma indicates someone is either filling in random data or showing their inability to handle data input forms. Most applications try to handle such issues by implementing complicated and deeply nested logic, but realistically it is nearly impossible to handle all such combinations of data that are valid or invalid.

To solve this problem, a business can use data mining to look at existing data and build rules for what looks valid. Each combination is scored with a level of confidence. The organization can then build the data entry application to use the data mining model for real-time data entry validation. The model scores the input against the universe of existing data and returns a level of confidence in the input. The application can then decide whether or not to accept the input based on a pre-determined level of confidence threshold.

This example points out the advantage of using a data mining engine that can run in real time: applications can be written that take advantage of the power of data mining. Rather than data mining being the end result, it becomes a part of the overall process and plays a role at each phase of integration, analysis, and reporting.

While validating input uses data mining at the front end of the data integration process, data mining can be used in the analysis phase as well. Data mining provides the ability to group or cluster values, such as similar customers or documents based on keywords. These clusters can then be fed back into the data warehouse so that analysis can be performed using these groupings. Once the groupings are known and fed back into the analysis loop, analysts can use them to look at data in ways that were not possible before.

One of the primary goals of intelligent applications is to make the power of the data mining models available to anyone, not just analysts. In the past, data mining has been the domain of experts with backgrounds in statistics or operations research. The data mining tools were built to support such users, but not to easily integrate with other applications. Thus, the ability to use data mining information was greatly restricted outside of the data mining product itself. However, with a tool that spans the entire process and opens up its models and results to other applications, businesses have the power to create intelligent applications that use data mining models at any stage.

Another aspect of a platform that allows for the creation of intelligent applications is a centralized server to store the data mining models and results. These models tend to be highly proprietary and secret. Storing them on the server protects them from being distributed outside of the organization. An added benefit is that with a shared location for models, companies have a single version of each model, not multiple variants residing on each analyst's desktop. Having a single version of the truth is one of the goals of data warehousing, and this concept can be extended to data mining so that there is a single version of the model that has been created and tuned for the particular business.

 

Regards,

Sankata
PT. Ecomindo Saranacipta
Gedung YDAP Denta Medika, 4th floor
Jl. Raya Pasar Minggu No. 45
Jakarta Selatan, Indonesia

Office : +6221 7900909

E-mail : sankata.ec@ecomindo.com

Blog : http://sankatalee.blogspot.com
ym: sankatalee | gtalk: sankatalee

 

Friday, October 19, 2007

A Story About 12.5 Dollars

This story is said to have taken place in the United States in 1920.
At the time, a boy of about nine was playing ball in his neighborhood.
Somehow he kicked the ball too hard and it struck the window of a
house in the complex. The window shattered to pieces, and the owner
of the house was furious.

The owner demanded compensation of 12.5 dollars for the broken
window, no matter who was responsible, even a small child. In those
days, 12.5 dollars was a great deal of money, equivalent to 125 hens.
The boy felt frightened and confused, because he had no money.

With a gloomy face, he went home to ask his father for the money.
"You must take responsibility for your mistake. Don't ask others to
bear your mistakes," said his father.

"But I have no money," the boy pleaded.

"I will lend you the money. But you must pay it back within one
year," said the father, setting his condition.

"All right!" the boy answered happily.

After that, the boy took odd jobs to earn money. Within six months
he managed to collect 12.5 dollars and repaid the debt to his
father. The father was delighted to receive the money from his son.
"You truly are a knight!" his father praised him with a smile.

In later years, word spread that the little boy had become the 40th
president of the United States. He governed America from 1981 to
1989. The story above is one of the experiences of Ronald Wilson
Reagan (1911 – 2004).

The lesson:

A sense of responsibility is capital for achieving success in any
field. From the story above, we can see that responsibility teaches
a person to overcome problems and improve themselves. His parents'
upbringing helped the young Ronald Reagan grow into a responsible
person.

During his leadership, Ronald Reagan was considered fairly
successful in advancing the United States economy. He was also loved
by the people. That was clearly apparent from the extraordinary
grief of most of the American people when he passed away on June 5,
2004.

Each of us is certainly required to be responsible, whether to
ourselves, our family, our environment, our work, and so on. "The
day you take responsibility and stop making excuses is the day you
start moving up to success," O.J. Simpson. Carrying out our
responsibilities well trains us to always do everything properly and
correctly.

As a philosopher like Aristotle said, "Quality is not an act, it is
a habit." If we always accustom ourselves to being responsible, that
attitude will become our hallmark and bring enormous benefit to us.
For example, someone who is always responsible will have a noble
spirit and an extraordinary strength of work ethic. In other words,
accustoming ourselves to responsibility yields a positive impact far
greater than its cost.

 

Regards,

Sankata

 

Data Mining - Introduction

Overview

Generally, data mining (sometimes called data or knowledge discovery) is the process of analyzing data from different perspectives and summarizing it into useful information - information that can be used to increase revenue, cut costs, or both. Data mining software is one of a number of analytical tools for analyzing data. It allows users to analyze data from many different dimensions or angles, categorize it, and summarize the relationships identified. Technically, data mining is the process of finding correlations or patterns among dozens of fields in large relational databases.

Continuous Innovation

Although data mining is a relatively new term, the technology is not. Companies have used powerful computers to sift through volumes of supermarket scanner data and analyze market research reports for years. However, continuous innovations in computer processing power, disk storage, and statistical software are dramatically increasing the accuracy of analysis while driving down the cost.

Example

For example, one Midwest grocery chain used the data mining capacity of Oracle software to analyze local buying patterns. They discovered that when men bought diapers on Thursdays and Saturdays, they also tended to buy beer. Further analysis showed that these shoppers typically did their weekly grocery shopping on Saturdays. On Thursdays, however, they only bought a few items. The retailer concluded that they purchased the beer to have it available for the upcoming weekend. The grocery chain could use this newly discovered information in various ways to increase revenue. For example, they could move the beer display closer to the diaper display. And, they could make sure beer and diapers were sold at full price on Thursdays.

Data, Information, and Knowledge

Data

Data are any facts, numbers, or text that can be processed by a computer. Today, organizations are accumulating vast and growing amounts of data in different formats and different databases. This includes:

  • operational or transactional data, such as sales, cost, inventory, payroll, and accounting
  • nonoperational data, such as industry sales, forecast data, and macroeconomic data
  • meta data - data about the data itself, such as logical database design or data dictionary definitions

Information

The patterns, associations, or relationships among all this data can provide information. For example, analysis of retail point of sale transaction data can yield information on which products are selling and when.

Knowledge

Information can be converted into knowledge about historical patterns and future trends. For example, summary information on retail supermarket sales can be analyzed in light of promotional efforts to provide knowledge of consumer buying behavior. Thus, a manufacturer or retailer could determine which items are most susceptible to promotional efforts.

Data Warehouses

Dramatic advances in data capture, processing power, data transmission, and storage capabilities are enabling organizations to integrate their various databases into data warehouses. Data warehousing is defined as a process of centralized data management and retrieval. Data warehousing, like data mining, is a relatively new term although the concept itself has been around for years. Data warehousing represents an ideal vision of maintaining a central repository of all organizational data. Centralization of data is needed to maximize user access and analysis. Dramatic technological advances are making this vision a reality for many companies. And, equally dramatic advances in data analysis software are allowing users to access this data freely. The data analysis software is what supports data mining.

What can data mining do?

Data mining is primarily used today by companies with a strong consumer focus - retail, financial, communication, and marketing organizations. It enables these companies to determine relationships among "internal" factors such as price, product positioning, or staff skills, and "external" factors such as economic indicators, competition, and customer demographics. And, it enables them to determine the impact on sales, customer satisfaction, and corporate profits. Finally, it enables them to "drill down" into summary information to view detail transactional data.

With data mining, a retailer could use point-of-sale records of customer purchases to send targeted promotions based on an individual's purchase history. By mining demographic data from comment or warranty cards, the retailer could develop products and promotions to appeal to specific customer segments.

For example, Blockbuster Entertainment mines its video rental history database to recommend rentals to individual customers. American Express can suggest products to its cardholders based on analysis of their monthly expenditures.

WalMart is pioneering massive data mining to transform its supplier relationships. WalMart captures point-of-sale transactions from over 2,900 stores in 6 countries and continuously transmits this data to its massive 7.5 terabyte Teradata data warehouse. WalMart allows more than 3,500 suppliers to access data on their products and perform data analyses. These suppliers use this data to identify customer buying patterns at the store display level. They use this information to manage local store inventory and identify new merchandising opportunities. In 1995, WalMart computers processed over 1 million complex data queries.

The National Basketball Association (NBA) is exploring a data mining application that can be used in conjunction with image recordings of basketball games. The Advanced Scout software analyzes the movements of players to help coaches orchestrate plays and strategies. For example, an analysis of the play-by-play sheet of the game played between the New York Knicks and the Cleveland Cavaliers on January 6, 1995 reveals that when Mark Price played the Guard position, John Williams attempted four jump shots and made each one! Advanced Scout not only finds this pattern, but explains that it is interesting because it differs considerably from the average shooting percentage of 49.30% for the Cavaliers during that game.

By using the NBA universal clock, a coach can automatically bring up the video clips showing each of the jump shots attempted by Williams with Price on the floor, without needing to comb through hours of video footage. Those clips show a very successful pick-and-roll play in which Price draws the Knick's defense and then finds Williams for an open jump shot.

How does data mining work?

While large-scale information technology has been evolving separate transaction and analytical systems, data mining provides the link between the two. Data mining software analyzes relationships and patterns in stored transaction data based on open-ended user queries. Several types of analytical software are available: statistical, machine learning, and neural networks. Generally, any of four types of relationships are sought:

  • Classes: Stored data is used to locate data in predetermined groups. For example, a restaurant chain could mine customer purchase data to determine when customers visit and what they typically order. This information could be used to increase traffic by having daily specials.
  • Clusters: Data items are grouped according to logical relationships or consumer preferences. For example, data can be mined to identify market segments or consumer affinities.
  • Associations: Data can be mined to identify associations. The beer-diaper example is an example of associative mining.
  • Sequential patterns: Data is mined to anticipate behavior patterns and trends. For example, an outdoor equipment retailer could predict the likelihood of a backpack being purchased based on a consumer's purchase of sleeping bags and hiking shoes.
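The associations bullet above (the beer-diaper example) comes down to counting co-occurrence. As a rough sketch of the idea, here are the two standard measures, support and confidence, computed in Python over a handful of made-up baskets (real tools apply the same arithmetic to millions of transactions):

```python
# Illustrative basket data - invented, not real store records.
baskets = [
    {"diapers", "beer", "milk"},
    {"diapers", "beer"},
    {"diapers", "bread"},
    {"milk", "bread"},
]

def support(items, baskets):
    """Fraction of baskets containing every item in `items`."""
    hits = sum(1 for b in baskets if items <= b)
    return hits / len(baskets)

def confidence(antecedent, consequent, baskets):
    """Of the baskets containing `antecedent`, the fraction that also contain `consequent`."""
    return support(antecedent | consequent, baskets) / support(antecedent, baskets)

print(support({"diapers", "beer"}, baskets))       # 0.5
print(confidence({"diapers"}, {"beer"}, baskets))  # ~0.667
```

A rule like "diapers → beer" is reported when both its support and its confidence clear user-chosen thresholds.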

Data mining consists of five major elements:

  • Extract, transform, and load transaction data onto the data warehouse system.
  • Store and manage the data in a multidimensional database system.
  • Provide data access to business analysts and information technology professionals.
  • Analyze the data by application software.
  • Present the data in a useful format, such as a graph or table.

Different levels of analysis are available:

  • Artificial neural networks: Non-linear predictive models that learn through training and resemble biological neural networks in structure.
  • Genetic algorithms: Optimization techniques that use processes such as genetic combination, mutation, and natural selection in a design based on the concepts of natural evolution.
  • Decision trees: Tree-shaped structures that represent sets of decisions. These decisions generate rules for the classification of a dataset. Specific decision tree methods include Classification and Regression Trees (CART) and Chi Square Automatic Interaction Detection (CHAID) . CART and CHAID are decision tree techniques used for classification of a dataset. They provide a set of rules that you can apply to a new (unclassified) dataset to predict which records will have a given outcome. CART segments a dataset by creating 2-way splits while CHAID segments using chi square tests to create multi-way splits. CART typically requires less data preparation than CHAID.
  • Nearest neighbor method: A technique that classifies each record in a dataset based on a combination of the classes of the k record(s) most similar to it in a historical dataset (where k ≥ 1). Sometimes called the k-nearest neighbor technique.
  • Rule induction: The extraction of useful if-then rules from data based on statistical significance.
  • Data visualization: The visual interpretation of complex relationships in multidimensional data. Graphics tools are used to illustrate data relationships.
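Of the levels above, the nearest neighbor method is the simplest to sketch: classify a new record by a majority vote among the k most similar historical records. A toy Python illustration (the data points are invented):

```python
from collections import Counter

def knn_classify(history, point, k=3):
    """history: list of (features, label) pairs; point: a feature tuple."""
    def dist(a, b):
        # Euclidean distance between two feature tuples
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    # Take the k historical records closest to the new point
    nearest = sorted(history, key=lambda rec: dist(rec[0], point))[:k]
    # Majority vote among their class labels
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

history = [((1, 1), "A"), ((1, 2), "A"), ((2, 1), "A"),
           ((8, 8), "B"), ((9, 8), "B")]
print(knn_classify(history, (1.5, 1.5)))  # A
print(knn_classify(history, (8.5, 8.0)))  # B
```

Because the vote runs against the stored historical data at query time, this is also an example of the real-time scoring described earlier.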

What technological infrastructure is required?

Today, data mining applications are available on all size systems for mainframe, client/server, and PC platforms. System prices range from several thousand dollars for the smallest applications up to $1 million a terabyte for the largest. Enterprise-wide applications generally range in size from 10 gigabytes to over 11 terabytes. NCR has the capacity to deliver applications exceeding 100 terabytes. There are two critical technological drivers:

  • Size of the database: the more data being processed and maintained, the more powerful the system required.
  • Query complexity: the more complex the queries and the greater the number of queries being processed, the more powerful the system required.

Relational database storage and management technology is adequate for many data mining applications of less than 50 gigabytes. However, this infrastructure needs to be significantly enhanced to support larger applications. Some vendors have added extensive indexing capabilities to improve query performance. Others use new hardware architectures such as Massively Parallel Processors (MPP) to achieve order-of-magnitude improvements in query time. For example, MPP systems from NCR link hundreds of high-speed Pentium processors to achieve performance levels exceeding those of the largest supercomputers.

Thursday, October 18, 2007

Hello....

Let's do it…

 

Regards,

Sankata

 

We will win this game

If we try our best from the heart, then we will win all that is true in the world...

Welcome to Our blog


Hello Everyone,

Welcome to our blog, nice to meet you all. We'll try to publish the newest and fastest information and other articles from trusted sources.

Later, we'll try to do and publish everything that can help everyone gain the knowledge to develop themselves.

regards,

Sankata & Andina